
Deep generative modeling

Jakub M. Tomczak

Deep Learning Researcher (Engineer, Staff)

Qualcomm AI Research

Qualcomm Technologies Netherlands B.V.

@UvA, Amsterdam, May 22, 2019


Introduction

Is generative modeling important?

A neural network learns to classify images: p(panda|x) = 0.99.

But adding a small amount of noise changes the prediction completely: p(panda|x + noise) = 0.01, p(dog|x + noise) = 0.9.

There is no semantic understanding of images.

A generative model p(x) can quantify uncertainty on new data:

● New data with a high probability of a horse = a highly probable decision!
● New data x with a high probability of a horse, but a low probability of the object under p(x) = an uncertain decision!

Where do we use generative modeling?

Image analysis, audio analysis, text analysis, graph analysis, medical data, Reinforcement Learning, Active Learning, and more...

Generative modeling: How?

Generative models:
● Autoregressive (e.g., PixelCNN)
● Flow-based (e.g., RealNVP, GLOW)
● Latent variable models:
  ○ Implicit models (e.g., GANs)
  ○ Prescribed models (e.g., VAE)

Generative modeling: Pros and cons

Model                                     Training   Likelihood    Sampling    Compression
Autoregressive models (e.g., PixelCNN)    Stable     Yes           Slow        No
Flow-based models (e.g., RealNVP)         Stable     Yes           Fast/Slow   No
Implicit models (e.g., GANs)              Unstable   No            Fast        No
Prescribed models (e.g., VAEs)            Stable     Approximate   Fast        Yes

19

Machine learning and (spherical) cows

Page 20: May 22, 2019 @UvA Amsterdam - GitHub Pages · Deep generative modeling Jakub M. Tomczak Deep Learning Researcher (Engineer, Staff) Qualcomm AI Research Qualcomm Technologies Netherlands

20

Machine learning and (spherical) cows

Page 21: May 22, 2019 @UvA Amsterdam - GitHub Pages · Deep generative modeling Jakub M. Tomczak Deep Learning Researcher (Engineer, Staff) Qualcomm AI Research Qualcomm Technologies Netherlands

21

Machine learning and (spherical) cows

Page 22: May 22, 2019 @UvA Amsterdam - GitHub Pages · Deep generative modeling Jakub M. Tomczak Deep Learning Researcher (Engineer, Staff) Qualcomm AI Research Qualcomm Technologies Netherlands

22

Machine learning and (spherical) cows

flow-based models latent variable models

Page 23: May 22, 2019 @UvA Amsterdam - GitHub Pages · Deep generative modeling Jakub M. Tomczak Deep Learning Researcher (Engineer, Staff) Qualcomm AI Research Qualcomm Technologies Netherlands

23

Deep latent variable models

Generative modeling

Modeling in high-dimensional spaces is difficult.

➔ Modeling all dependencies among pixels is problematic.

A possible solution: Latent Variable Models!

Generative modeling with Latent Variables

Generative process: first sample the latent variables, then generate the observation from them.

Log of the marginal distribution: obtained by integrating out the latent variables.

How can we train such a model efficiently?
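The equations on these slides did not survive extraction; in the standard notation for latent variable models they read:

$$z \sim p_\lambda(z), \qquad x \sim p_\theta(x \mid z)$$

$$\log p_\theta(x) = \log \int p_\theta(x \mid z)\, p_\lambda(z)\, \mathrm{d}z$$

The integral over z is what makes direct maximum-likelihood training hard.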

Variational inference for Latent Variable Models

Introduce a variational posterior and apply Jensen's inequality to obtain a lower bound that decomposes into a reconstruction error and a regularization term. Its components are the decoder, the encoder, and the prior.

Decoder + encoder + prior + the reparameterization trick = Variational Auto-Encoder.
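The derivation sketched across these slides is the standard evidence lower bound (ELBO); reconstructed in the usual notation:

$$\log p_\theta(x) = \log \mathbb{E}_{q_\phi(z \mid x)}\!\left[\frac{p_\theta(x \mid z)\, p_\lambda(z)}{q_\phi(z \mid x)}\right] \ \ge\ \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big] - \mathrm{KL}\big(q_\phi(z \mid x)\,\|\,p_\lambda(z)\big)$$

The first term is the reconstruction error (the decoder), the KL term is the regularization (the encoder against the prior); the inequality is Jensen's.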

Variational Auto-Encoders

[diagram: encoder net → code (μ, σ) → decoder net, with a prior over the code]

● VAE copies the input to the output through a bottleneck.
● VAE learns a code of the data.
● VAE puts a prior on the latent code.
● VAE can generate new data.
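As a concrete illustration (not the talk's code), a minimal PyTorch sketch of a Gaussian VAE with the reparameterization trick, assuming flattened binary images; the architecture and all names are illustrative:

```python
# A minimal Gaussian-VAE sketch with the reparameterization trick,
# assuming MNIST-like inputs flattened to 784 dimensions.
import torch
import torch.nn as nn

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=40):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)         # mean of q(z|x)
        self.log_var = nn.Linear(h_dim, z_dim)    # log-variance of q(z|x)
        self.decoder = nn.Sequential(
            nn.Linear(z_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, x_dim), nn.Sigmoid()  # Bernoulli means
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, log_var = self.mu(h), self.log_var(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I),
        # so gradients flow through mu and log_var.
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)
        return self.decoder(z), mu, log_var

def negative_elbo(x, x_rec, mu, log_var):
    # Reconstruction error (Bernoulli log-likelihood) ...
    rec = nn.functional.binary_cross_entropy(x_rec, x, reduction='sum')
    # ... plus KL(q(z|x) || N(0, I)) in closed form.
    kl = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
    return rec + kl
```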

Components of VAEs

● Decoder: Resnets, DRAW, autoregressive models, normalizing flows
● Prior: autoregressive models, normalizing flows, VampPrior, implicit prior
● Encoder: normalizing flows, discrete encoders, hyperspherical dist., hyperbolic-normal dist., group theory
● Objective: adversarial learning, MMD, Wasserstein AE

Variational posterior in VAEs

Question: How do we minimize KL(q||p)? In other words: how do we formulate a more flexible family of approximate (variational) posteriors? A Gaussian is not sufficiently flexible, and we need a computationally efficient tool.

Arvanitidis, G., Hansen, L. K., & Hauberg, S. (2017). Latent space oddity: on the curvature of deep generative models. arXiv preprint arXiv:1710.11379.

Variational inference with normalizing flows

● Sample from a "simple" distribution: z_0 ~ q_0(z_0).
● Apply a sequence of K invertible transformations: z_K = f_K ∘ … ∘ f_1(z_0); the change of variables then yields the density of z_K.

The learning objective (ELBO) with normalizing flows gains a sum of log-Jacobian-determinant terms, and the difficulty lies in calculating the Jacobian determinant:

● Volume-preserving flows: the determinant equals 1 by construction.
● General normalizing flows: the determinant must be "easy" to compute.

Rezende, D. J., & Mohamed, S. (2015). Variational inference with normalizing flows. ICML 2015.
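The formulas referenced above are, following Rezende & Mohamed (2015), the change of variables

$$\ln q_K(z_K) = \ln q_0(z_0) - \sum_{k=1}^{K} \ln \left|\det \frac{\partial f_k}{\partial z_{k-1}}\right|$$

so the ELBO becomes

$$\mathcal{L} = \mathbb{E}_{q_0(z_0)}\!\left[\log p(x \mid z_K) + \log p(z_K) - \log q_0(z_0) + \sum_{k=1}^{K} \ln \left|\det \frac{\partial f_k}{\partial z_{k-1}}\right|\right]$$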

Sylvester Normalizing Flows

First, let us take a look at planar flows (Rezende & Mohamed, 2015):

z' = z + u h(wᵀz + b)

This is equivalent to a residual layer with a single neuron. Can we calculate the Jacobian determinant efficiently?

We can use the matrix determinant lemma to get the Jacobian determinant:

det(I + u h'(wᵀz + b) wᵀ) = 1 + h'(wᵀz + b) wᵀu,

which is linear w.r.t. the number of z's. But the single-neuron bottleneck requires many flow steps, so how can we improve on that?

1. Can we generalize planar flows?
2. If yes, how can we compute the Jacobian determinant efficiently?

Sylvester Normalizing Flows
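To make the step and its cheap log-determinant concrete, a minimal PyTorch sketch of a planar flow (illustrative names; it omits the constraint on u that the paper uses to guarantee invertibility):

```python
# A planar flow step z' = z + u * tanh(w^T z + b) with its
# log-Jacobian-determinant via the matrix determinant lemma.
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.1)
        self.w = nn.Parameter(torch.randn(dim) * 0.1)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):                           # z: (batch, dim)
        lin = z @ self.w + self.b                   # (batch,)
        z_new = z + self.u * torch.tanh(lin).unsqueeze(-1)
        # log|det J| = log|1 + h'(w^T z + b) w^T u|, O(dim) per sample
        h_prime = 1 - torch.tanh(lin) ** 2
        log_det = torch.log(torch.abs(1 + h_prime * (self.w @ self.u)) + 1e-8)
        return z_new, log_det
```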

SNF: Generalizing Planar Flows

We can control the bottleneck by generalizing the vectors u and w to matrices A and B:

z' = z + A h(Bz + b)

How do we calculate the determinant of the Jacobian? Use the Sylvester determinant identity:

det(I_D + A diag(h'(Bz + b)) B) = det(I_M + diag(h'(Bz + b)) BA)

OK, but this is still very expensive! Can we simplify these calculations?

Use of the Sylvester determinant identity yields the M×M determinant above. Next, we can use a QR decomposition to represent A and B:

A = QR and B = R̃Qᵀ,

where the columns of Q are orthonormal vectors and R, R̃ are triangular matrices.

SNF: Invertible transformations

But is the proposed flow invertible in general? NO.

Theorem (van den Berg et al., 2018). If h is smooth with a bounded, strictly positive derivative, and if the diagonal entries of R and R̃ satisfy r_ii r̃_ii > −1/‖h'‖∞, then

z' = z + QR h(R̃Qᵀz + b)

is invertible.

Hence:

1. For Q and the R's above, computing the Jacobian determinant is efficient.
2. Restricting the R's results in invertible transformations.

But how do we keep Q orthogonal?

SNF: Learning an orthogonal matrix

1. (O-SNF) Iterative orthogonalization procedure (e.g., Kovarik, 1970):
   a. Repeat until convergence: Q ← Q(I + ½(I − QᵀQ)).
   b. We can backpropagate through this procedure.
   c. We can control the bottleneck by changing the number of columns.
2. (H-SNF) Use l Householder transformations to represent Q.
   a. Then SNF is a non-linear extension of the Householder flow.
   b. No bottleneck!
3. (T-SNF) Alternate between the identity matrix and a fixed permutation matrix.
   a. This ensures that all elements of z are processed equally on average.
   b. Also used in RealNVP and IAF.
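A minimal sketch of the O-SNF orthogonalization iteration, assuming the initial Q is close enough to orthogonal for the iteration to converge (names and tolerances are illustrative):

```python
# Kovarik-style iterative orthogonalization of a D x M matrix (M <= D):
# repeat Q <- Q (I + 0.5 (I - Q^T Q)) until Q^T Q is close to I.
import torch

def orthogonalize(q, steps=30, tol=1e-6):
    m = q.shape[1]
    eye = torch.eye(m, dtype=q.dtype, device=q.device)
    for _ in range(steps):
        qtq = q.t() @ q
        if torch.norm(qtq - eye) < tol:   # columns are (near-)orthonormal
            break
        q = q @ (eye + 0.5 * (eye - qtq))
    return q
```

Because every step is a differentiable matrix product, autograd can backpropagate through the whole procedure, as point 1b requires.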

Sylvester Normalizing Flows

● A single step: z' = z + QR h(R̃Qᵀz + b).
● Keep Q orthogonal:
  ○ With a bottleneck: O-SNF.
  ○ No bottleneck: H-SNF, T-SNF.
● To increase flexibility, we can use hypernets g(x) to compute Q and the R's from the data.

SNF: Results on MNIST

SNF: Results on other data

● No. of flows: 16
● IAF: 1280-wide MADE, no hypernets
● Bottleneck in O-SNF: 32
● No. of Householder transformations in H-SNF: 8


Geometric perspective on VAEs

Question: Is it possible to recover the true Riemannian structure of the latent space? In other words: will geodesics follow the data manifold?

For a Gaussian VAE: No. We need a better notion of uncertainty or different models.

Arvanitidis, G., Hansen, L. K., & Hauberg, S. (2017). Latent space oddity: on the curvature of deep generative models. arXiv preprint arXiv:1710.11379.

Potential problems with Gaussians

In VAEs it is very often assumed that the posterior and the prior are Gaussians. But:

● The Gaussian prior is concentrated around the origin ⟶ possible bias.
● In high dimensions, the Gaussian concentrates on a hypersphere ⟶ the ℓ2 norm fails.

Using a hyperspherical latent space

Since in high dimensions the Gaussian distribution concentrates on a hypersphere, we propose to use a distribution defined on the hypersphere: the von Mises-Fisher distribution, whose normalizing constant involves the modified Bessel function of the first kind of order v.

Davidson, T. R., Falorsi, L., De Cao, N., Kipf, T., & Tomczak, J. M. (2018). Hyperspherical Variational Auto-Encoders. UAI 2018.
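In the notation of Davidson et al. (2018), the von Mises-Fisher density on the hypersphere is

$$q(z \mid \mu, \kappa) = \mathcal{C}_m(\kappa)\, \exp\!\big(\kappa\, \mu^\top z\big), \qquad \mathcal{C}_m(\kappa) = \frac{\kappa^{m/2-1}}{(2\pi)^{m/2}\, \mathcal{I}_{m/2-1}(\kappa)}$$

with mean direction $\mu \in \mathcal{S}^{m-1}$, concentration $\kappa \ge 0$, and $\mathcal{I}_v$ the modified Bessel function of the first kind of order v.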

Hyperspherical VAE

● We define the latent space to be the hypersphere.
● The variational distribution is the von Mises-Fisher, and the prior is uniform, i.e., von Mises-Fisher with κ = 0. The KL term is then available in closed form.
● There exists an efficient sampling procedure using the Householder transformation (Ulrich, 1984).
● The reparameterization trick can be achieved using rejection sampling (Naesseth et al., 2017).

Hyperspherical VAE: Results on MNIST

Hyperspherical VAE: Results on semi-supervised MNIST

Hyperspherical GraphVAE: Link prediction


Problems of holes in VAEs

● There is a discrepancy between the posteriors and the Gaussian prior that results in regions that were never "seen" by the posterior (holes). ⇾ multi-modal prior
● The sampling process could produce unrealistic samples.

[figure: latent space under a standard prior vs. a multi-modal prior]

Rezende, D.J. and Viola, F., 2018. Taming VAEs. arXiv preprint arXiv:1810.00597.

Looking for the optimal prior

● Let's rewrite the ELBO over the training data, separating the term that involves the prior.
● KL = 0 iff the prior equals the aggregated posterior, q(z) = (1/N) Σ_n q(z|x_n); then the optimal prior = the aggregated posterior.
● Summing over all training data is infeasible, and since the sample is finite, it could cause some additional instabilities. Instead, we propose to use a mixture of variational posteriors at learnable pseudoinputs: a multi-modal prior whose pseudoinputs are trained from scratch by SGD.

Tomczak, J.M., Welling, M. (2018), VAE with a VampPrior, AISTATS 2018.
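The proposed prior (the VampPrior) is a mixture of variational posteriors evaluated at K learnable pseudoinputs $u_k$:

$$p_\lambda(z) = \frac{1}{K} \sum_{k=1}^{K} q_\phi(z \mid u_k)$$

A minimal PyTorch sketch of its log-density, assuming a Gaussian encoder `encode(x) -> (mu, log_var)`; the class and argument names are illustrative, not the paper's reference code:

```python
# VampPrior log-density: log p(z) = log (1/K) sum_k q(z | u_k),
# with pseudoinputs u_1..u_K trained from scratch by SGD.
import math
import torch
import torch.nn as nn

class VampPrior(nn.Module):
    def __init__(self, encode, n_pseudo=500, x_dim=784):
        super().__init__()
        self.encode = encode                              # shared VAE encoder
        self.pseudoinputs = nn.Parameter(torch.rand(n_pseudo, x_dim))

    def log_prob(self, z):                                # z: (batch, z_dim)
        mu, log_var = self.encode(self.pseudoinputs)      # (K, z_dim)
        z = z.unsqueeze(1)                                # (batch, 1, z_dim)
        # log N(z; mu_k, sigma_k^2) for every pseudoinput k
        log_n = -0.5 * (log_var + (z - mu) ** 2 / log_var.exp()
                        + math.log(2 * math.pi)).sum(-1)  # (batch, K)
        return torch.logsumexp(log_n, dim=1) - math.log(
            self.pseudoinputs.shape[0])
```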

VampPrior: Experiments (pseudoinputs)

VampPrior: Experiments (samples)

VampPrior: Experiments (reconstructions)


Flow-based models

The change of variables formula

● Let's recall the change of variables formula with invertible transformations.
● We can think of it as an invertible neural network mapping between the pixel space and the latent space.

Rippel, O., & Adams, R. P. (2013). High-dimensional probability estimation with deep density models. arXiv preprint arXiv:1302.5125.
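The formula being recalled is, for an invertible f mapping the latent space to the pixel space, x = f(z):

$$p(x) = p\big(f^{-1}(x)\big)\, \left|\det \frac{\partial f^{-1}(x)}{\partial x}\right|$$

so an invertible network can be trained by maximizing the exact log-likelihood.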

RealNVP

● Design the invertible transformations as coupling layers: copy one part of the input and transform the other part conditioned on the copied part.
● Invertible by design: the copied part is available to undo the transformation.
● Easy Jacobian: the Jacobian is triangular, so its determinant is a product of diagonal terms.

Dinh, L., Sohl-Dickstein, J., & Bengio, S. (2016). Density estimation using Real NVP. arXiv preprint arXiv:1605.08803.
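The affine coupling layer of Dinh et al. (2016) copies one part of the input and affinely transforms the rest:

$$y_{1:d} = x_{1:d}, \qquad y_{d+1:D} = x_{d+1:D} \odot \exp\!\big(s(x_{1:d})\big) + t(x_{1:d})$$

A minimal PyTorch sketch of this layer with its log-determinant (the network computing s and t is illustrative):

```python
# RealNVP affine coupling layer: y1 = x1, y2 = x2 * exp(s(x1)) + t(x1),
# with log|det J| = sum of s(x1); s and t are never inverted.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim, hidden=256):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)))

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        s, t = self.net(x1).chunk(2, dim=1)
        y2 = x2 * torch.exp(s) + t          # transform the second half
        log_det = s.sum(dim=1)              # triangular Jacobian
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s, t = self.net(y1).chunk(2, dim=1)
        x2 = (y2 - t) * torch.exp(-s)       # undo the transformation
        return torch.cat([y1, x2], dim=1)
```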

Results

GLOW

● Adds a trainable 1x1 convolution followed by an affine coupling layer.
● Adds actnorm.

Results

Kingma, D. P., & Dhariwal, P. (2018). Glow: Generative flow with invertible 1x1 convolutions. NIPS.


Future directions

Blurriness and sampling in VAEs

● How to avoid sampling from holes?
● Should we follow geodesics in the latent space?
● How to use the geometry of the latent space to build better decoders?
● How to build temporal decoders? Can we do better than Conv3D?

Compression and VAEs

● Taking a deterministic encoder allows us to simplify the objective.
● It is important to learn a powerful prior. This is challenging!
● Is it easier to learn a prior with temporal dependencies?
● Can we alleviate some dependencies by using hypernets?

Active learning/RL and VAEs

● Using the latent representation to navigate and/or quantify uncertainty.
● Formulating policies entirely in the latent space.
● Do we need a better notion of sequential dependencies?

Hybrid and flow-based models

● We need a better understanding of the latent space.
● Joining an invertible model (a flow-based model) with a predictive model.
● Isn't this model an overkill?
● How would it work in the multi-modal learning scenario?

Hybrid models and OOD samples

● Going back to the first slides, we need a good notion of p(x).
● Distinguishing out-of-distribution (OOD) samples is very important.
● Crucial for decision making, outlier detection, policy learning...

Thank you!

Follow us on social media. For more information, visit us at www.qualcomm.com & www.qualcomm.com/blog.

Nothing in these materials is an offer to sell any of the components or devices referenced herein. References in this presentation to "Qualcomm" may mean Qualcomm Incorporated, Qualcomm Technologies, Inc., and/or other subsidiaries or business units within the Qualcomm corporate structure, as applicable. Qualcomm Incorporated includes Qualcomm's licensing business, QTL, and the vast majority of its patent portfolio. Qualcomm Technologies, Inc., a wholly-owned subsidiary of Qualcomm Incorporated, operates, along with its subsidiaries, substantially all of Qualcomm's engineering, research and development functions, and substantially all of its product and services businesses, including its semiconductor business, QCT.

©2019 Qualcomm Technologies, Inc. and/or its affiliated companies. All Rights Reserved. Qualcomm is a trademark of Qualcomm Incorporated, registered in the United States and other countries. Other products and brand names may be trademarks or registered trademarks of their respective owners. Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.