Inverting Well Conditioned Matrices in Quantum LogSpace
Amnon Ta-Shma, Tel-Aviv University

Jan 12, 2016

Transcript
Page 1:

Inverting Well Conditioned Matrices in Quantum LogSpace

Amnon Ta-Shma
Tel-Aviv University

Page 2:

Space Bounded Complexity

Space complexity measures the memory size needed for solving a problem.

Page 3:

An Example: Multiplying two matrices

Input: Two n × n matrices A, B.

Output: C = AB.

Algorithm: C_ij = Σ_k A_ik · B_kj

For i = 1,…,n
  For j = 1,…,n
    c = 0
    For k = 1,…,n
      c = c + A_ik · B_kj
    output c

We do not count the input as working area, because we are not allowed to change it. We do not count the output as working area, because we are not allowed to read or change it. We view it as sending the output to a printer.

We only count the memory elements that we can read, write and change (i, j, c, k) as working area.

Page 4:

An Example: Multiplying two matrices

Input: Two n × n matrices A, B.

Output: C = AB.

Algorithm: C_ij = Σ_k A_ik · B_kj

• The input is not counted.
• The output is not counted.
• The only thing we count is memory we can read, write and change.

The algorithm above runs in O(log n) space.
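
A minimal sketch of this idea (our own illustration; the function name is ours). The inputs are read-only, the output array plays the role of the "printer", and the only workspace is the three counters and the accumulator, each of O(log n) bits:

```python
import numpy as np

def multiply_logspace_style(A, B):
    """Multiply two n x n matrices keeping only the counters i, j, k and the
    accumulator c as workspace, mirroring the slide's algorithm."""
    n = A.shape[0]
    C = np.zeros((n, n))          # write-only "printer" in this sketch
    for i in range(n):            # counter: O(log n) bits
        for j in range(n):        # counter: O(log n) bits
            c = 0.0               # accumulator
            for k in range(n):    # counter: O(log n) bits
                c += A[i, k] * B[k, j]
            C[i, j] = c           # "output c"
    return C

A = np.random.rand(4, 4)
B = np.random.rand(4, 4)
assert np.allclose(multiply_logspace_style(A, B), A @ B)
```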

Page 5:

Another Example: Undirected connectivity

Input: An undirected graph G=(V,E).

Output: Is the graph connected?

• Can be solved with linear space and time.
• Omer Reingold showed the problem can be solved with logarithmic space and polynomial time.

Page 6:

Problems not known to be in Log

• Connectivity of directed graphs – NL-complete.
• Determinant of an integer matrix – DET-complete.
• Inverting an integer matrix – DET-complete.

NL – Non-deterministic LogSpace.

DET – all languages that are LogSpace reducible to the integer determinant.

Page 7:

What is known

Log ⊆ NL ⊆ DET ⊆ DSPACE(log² n)

STCON is NL-complete.

Matrix inversion and integer determinant are DET-complete.

Page 8:

Probabilistic space-bounded computation

BPL – the class of languages that are solvable by space-bounded machines that have online access to an unbounded sequence of truly uniform bits.

Log ⊆ BPL ⊆ DET

BPL ⊆ DSPACE(log^1.5 n)   [Saks-Zhou]

Page 9:

Quantum space-bounded computation

BQL – all languages solvable by a LOG machine that may use O(log n) qubits.

• Counting the number of qubits is a natural complexity measure.

• The definition has several variants, and we will discuss it soon.

Page 10:

What do we know about BQL?

Log ⊆ BPL ⊆ BQL ⊆ DSPACE(log² n)

Not much else is known.

No natural candidate for a language in BQL that is not known to be in BPL.

Page 11:

In this talk

• We will modify an algorithm of Harrow, Hassidim and Lloyd (HHL) for approximate matrix inversion.

• HHL studied quantum time complexity. We will study quantum space complexity.

• We will show the problem is in BQL.
• The problem is not known to be in BPL.

Page 12:

Quadratic gap

• This is the first natural candidate for a problem in BQL that is not known to be in BPL.

• It presents a quadratic gap between BQL and what we currently know for BPL, and this gap is the best possible.

• Our work might lead to new classical algorithms.

Page 13:

Defining BQL

Page 14:

Deterministic Space TM

• Input tape: Read only, head moves in all directions.
• Output tape: Write only, head moves left.
• Work tape: Read/Write, all directions.

Page 15:

Quantum space-bounded machines

• An additional quantum tape with O(log n) qubits.
• Two heads over the quantum tape.
• The allowed quantum operations are HAD, CNOT, T, plus measurements M in the standard basis.

[Figure: an example circuit over HAD, CNOT and T gates with an intermediate measurement.]

Page 16:

Classical control

We use the usual transition function mechanism. The transition function δ depends only on the classical data.

δ: Q × Input × Work → Q × Work × Out × {L,R}⁴

δ(q_CNOT) applies CNOT to the qubits under the two heads.

Similarly for δ(q_HAD) and δ(q_T).

δ(q_M) measures the qubit under the first head in the standard basis, and moves to q_{M,0} or q_{M,1} depending on the answer.

Page 17:

BQL

• O(log n) classical bits and qubits.
• Classical control.
• Intermediate measurements.

BQL without intermediate measurements is also interesting but possibly much weaker.


Page 18:

Matrix inversion and the HHL algorithm

Page 19:

Time complexity of Matrix inversion

• Can be solved as fast as matrix multiplication. Current best: time O(n^ω), ω ≈ 2.37.

• Matrix inversion depends on all input bits, and so the time complexity must be Ω(n²).

Page 20:

The HHL problem

The HHL algorithm studies a modified version of matrix inversion:

Input: A matrix A, a vector b.

Output: Approximation of certain predicates of x = A⁻¹b.

Since we deal with approximation, the input matrix has to be stable.

Page 21:

Stability – the condition number

• Matrix inversion is not stable if there is an eigenvalue close to 0.

• Matrix inversion is stable if all eigenvalues are far from zero.

The condition number κ(A) is defined to be

κ(A) = ‖A‖ · ‖A⁻¹‖
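
As a quick illustration (a minimal numpy sketch, our own; the helper name is ours): in the spectral norm, κ(A) is the ratio of the largest to the smallest singular value, and can be checked against numpy's built-in routine.

```python
import numpy as np

def condition_number(A):
    """kappa(A) = ||A|| * ||A^{-1}|| in the spectral (operator) norm,
    i.e. the ratio of the largest to the smallest singular value."""
    s = np.linalg.svd(A, compute_uv=False)
    return s.max() / s.min()

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(condition_number(A))      # definition above
print(np.linalg.cond(A, 2))     # numpy's spectral-norm condition number
```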

Page 22:

HHL’09

Input: A matrix A, a vector b, condition number κ.

Output: Approximation of certain predicates of x = A⁻¹b.

• Quantum time complexity: O(κ log n).
• Exponentially faster than the classical time bound Ω(n).

Page 23:

HHL - Summary

• Only an approximation.
• Only for well conditioned A.
• Only for sparse matrices A.
• Only for special b.
• Only for certain predicates over x.

Very nice idea!

Surprising technique and result.

Page 24:

Our result

Input: A matrix A, condition number κ.

Output: Approximation of A⁻¹.

• Quantum space complexity: O(log(κn)).
• Currently the best classical bound is O(log² n).

Quadratic gap.

Page 25:

The technique

Page 26:

Basic idea: Sampling the spectrum using phase estimation

Page 27:

First observation: We can work with Hermitian matrices

Given input A, we look for the SVD, A = UDV.

Define

  H = [ 0   A ]
      [ A†  0 ]

H is Hermitian. With A = UDV, H decomposes through the blocks of A's SVD:

  H = [ U  0  ] [ 0   D ] [ U  0  ]†
      [ 0  V† ] [ D†  0 ] [ 0  V† ]

and it so happens that one can read A's decomposition from H's decomposition.

We also assume all eigenvalues are well separated.
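
A small numpy sketch of this observation (our own illustration, not the paper's construction details): the Hermitian dilation of A has eigenvalues ±σᵢ, so its spectrum encodes A's singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# Hermitian dilation: H = [[0, A], [A^dagger, 0]]
n = A.shape[0]
H = np.block([[np.zeros((n, n)), A],
              [A.conj().T, np.zeros((n, n))]])

eigvals = np.linalg.eigvalsh(H)               # eigenvalues of the Hermitian H
sigmas = np.linalg.svd(A, compute_uv=False)   # singular values of A

# The spectrum of H is exactly {+sigma_i} union {-sigma_i}.
print(np.sort(np.abs(eigvals)))
print(np.sort(np.concatenate([sigmas, sigmas])))
```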

Page 28:

Basic approach

Input: Hermitian A.

U = e^{iA} is unitary.

Assume:
• We can simulate U^t for t = 1,…,T, and
• We know an eigenvector v of U.

Then, using phase estimation, we estimate the eigenvalue λ associated with v.

Page 29:

First challenge: How do we find an eigenvector?

Classically: A big question.

Once we know A and an eigenvector v, we can easily compute the eigenvalue λ in small space.

Page 30:

Sampling instead of finding an eigenvector

The completely mixed state I is the mixture obtained by taking a uniformly random eigenvector of A.

If we apply phase estimation to I, we sample a random (eigenvector, eigenvalue) pair of A.

We can generate the uniform distribution over the eigenvectors of A, even though we do not know any specific eigenvector.
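
A classical stand-in that mimics the statistics this gives us (a sketch only, not the quantum circuit; the function name and parameters are ours): diagonalize A, draw a uniformly random eigenvector, and report its eigenvalue to the precision phase estimation would achieve.

```python
import numpy as np

def sample_eigenpair(A, bits=10, rng=None):
    """Classically mimic phase estimation applied to the completely mixed
    state: return a uniformly random (eigenvector, eigenvalue) pair of the
    Hermitian matrix A, with the eigenvalue rounded to `bits` bits."""
    rng = rng or np.random.default_rng()
    eigvals, eigvecs = np.linalg.eigh(A)    # A is assumed Hermitian
    i = rng.integers(len(eigvals))          # uniform over eigenvectors
    approx = np.round(eigvals[i] * 2**bits) / 2**bits
    return eigvecs[:, i], approx

A = np.diag([0.25, 0.5, 0.75])
v, lam = sample_eigenpair(A)
print(lam)
```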

Page 31:

Second challenge: Simulating U = e^{iA}

{HAD, CNOT, T} is a universal basis, hence any unitary U can be approximated by a circuit with these gates.

The challenge is designing a deterministic LogSpace algorithm that, given A, produces a quantum circuit over HAD, CNOT, T that approximates U = e^{iA}.

Page 32:

Reminder: Universality of HAD, CNOT, T

Given a unitary U:

1. Decompose U into a product of 2-level unitaries.

2. Convert a 2-level unitary to a product of CNOT and 1-qubit unitaries.

3. Approximate any 1-qubit unitary by a short product of HAD, T

(A 2-level unitary is a unitary that acts non-trivially only on a 2-dimensional subspace spanned by 2 standard basis vectors. Step 3 uses the Solovay-Kitaev theorem.)

Page 33:

Simulating U = e^{iA} in small space

Given a unitary U:

1. Approximately decompose it into a product of 2-level unitaries, using the Trotter formula (see the sketch after this list).

2. Convert a 2-level unitary to a product of CNOT and 1-qubit unitaries.

3. Approximate any 1-qubit unitary by a short product of HAD, T using a space-efficient version of the Solovay-Kitaev theorem, recently proved by [vM,W].
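
As a toy illustration of the Trotter step (our own sketch with an arbitrary split of A into two Hermitian pieces, not the 2-level decomposition the algorithm actually uses), the Lie-Trotter product formula approximates e^{i(B+C)} by interleaving many short evolutions under B and C:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                      # a Hermitian matrix

# Toy split A = B + C: diagonal part and off-diagonal part (both Hermitian).
B = np.diag(np.diag(A))
C = A - B

exact = expm(1j * A)
for r in [1, 10, 100, 1000]:
    step = expm(1j * B / r) @ expm(1j * C / r)
    trotter = np.linalg.matrix_power(step, r)
    print(r, np.linalg.norm(trotter - exact, 2))   # error shrinks as r grows
```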

Page 34:

Altogether:

Given A:

Run phase estimation with U = e^{iA} on the completely mixed state.

This uniformly samples an approximation of an (eigenvector, eigenvalue) pair, in logarithmic space.

Page 35:

Approximating the whole spectrum

Page 36:

First attempt: Repeated sampling

• Assume all eigenvalues are in [-1,1].
• Divide [-1,1] into small consecutive intervals.
• For each interval, pick poly(n) independent samples, and estimate the number of eigenvalues in the interval by the fraction of samples that fall into it (see the sketch below).

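A minimal classical sketch of this repeated-sampling estimate (our own illustration; it ignores, for the moment, the boundary issue raised on the next slide):

```python
import numpy as np

def estimate_spectrum_histogram(A, num_intervals=8, num_samples=100_000, rng=None):
    """Estimate how many eigenvalues of A fall in each sub-interval of [-1, 1]
    from uniform eigenvalue samples."""
    rng = rng or np.random.default_rng()
    n = A.shape[0]
    eigvals = np.linalg.eigh(A)[0]                   # stand-in for eigenvalue sampling
    samples = rng.choice(eigvals, size=num_samples)  # uniform eigenpair samples
    edges = np.linspace(-1.0, 1.0, num_intervals + 1)
    counts, _ = np.histogram(samples, bins=edges)
    return counts / num_samples * n                  # fraction * n ~ #eigenvalues

A = np.diag([-0.9, -0.3, 0.1, 0.1, 0.6])
print(estimate_spectrum_histogram(A))
```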

Page 37:

A problem: eigenvalues close to a boundary

Eigenvalues that lie close to an interval boundary might fall into both neighboring intervals and lead to wrong results.


Page 38:

The solution: Consistent estimation

A probabilistic/quantum algorithm estimates a value z if, w.h.p., it outputs a value close to z.

A probabilistic/quantum algorithm consistently estimates a value z if, w.h.p., it outputs a fixed value close to z.

Consistent sampling solves the problem above.

Page 39:

Consistent Sampling using the shift & truncate method [SZ]

Original accuracy: 2⁻¹⁰

Page 40:

Consistent Sampling using the shift & truncate method [SZ]

Original accuracy: 2⁻¹⁰. New accuracy: 2⁻²⁰.

Pick uniformly a value 0 < k < 2¹⁰ and fix it.

Shift the eigenvalues by the fixed shift k · 2⁻²⁰.

Now, w.h.p., all eigenvalues are far away from a boundary.
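
A small sketch of the shift & truncate idea (our own illustration using the slide's parameters; the function name is ours): estimate to the finer accuracy 2⁻²⁰, add the fixed random shift k·2⁻²⁰, and truncate to the coarser 2⁻¹⁰ grid, so that repeated noisy estimates of the same eigenvalue w.h.p. all snap to the same value.

```python
import numpy as np

COARSE = 2.0**-10     # original accuracy / grid
FINE = 2.0**-20       # new, finer accuracy

rng = np.random.default_rng()
k = rng.integers(1, 2**10)       # picked once and fixed for the whole run

def consistent_estimate(noisy_estimate, k):
    """Shift by the fixed k * 2^-20, then truncate to the 2^-10 grid."""
    shifted = noisy_estimate + k * FINE
    return np.floor(shifted / COARSE) * COARSE

# An eigenvalue sitting exactly on a coarse boundary: naive rounding of noisy
# estimates could flip between neighboring intervals, but after the fixed shift
# every 2^-20-accurate estimate truncates to the same value w.h.p.
lam = 37 * COARSE
estimates = lam + rng.uniform(-FINE, FINE, size=5)
print([consistent_estimate(e, k) for e in estimates])
```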

Page 41:

Approximate the spectrum using consistent sampling

• Divide [-1,1] into small consecutive intervals.
• For each interval, pick poly(n) independent samples, and estimate the number of eigenvalues in the interval by the fraction of samples that fall into it.


Page 42:

Approximating the eigenvectors

Page 43:

Quantum state tomography

Quantum tomography is the process of reconstructing the quantum state of a source by measurements on the systems coming from the source.

Quantum tomography is possible if we can repeatedly and consistently generate the same state.

Page 44:

Estimating an eigenvector

We saw we can consistently estimate an eigenvalue λᵢ. Each time we get λᵢ, we have the n-dimensional eigenvector vᵢ, represented with log(n) qubits.

Using quantum state tomography we efficiently output the n coordinates of vᵢ.

Page 45:

Quantum tomography in small space

For each k, l, we estimate the measurement statistics of four projectors, where:

E⁽¹⁾ projects onto |k⟩,
E⁽²⁾ projects onto |l⟩,
E⁽³⁾ projects onto |k⟩+|l⟩, and
E⁽⁴⁾ projects onto |k⟩+i|l⟩.
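
The exact formula on the slide is not preserved in this transcript, but the standard reconstruction from these four projectors looks as follows (a minimal numpy sketch, our own; we normalize the last two projectors onto (|k⟩+|l⟩)/√2 and (|k⟩+i|l⟩)/√2): the real and imaginary parts of the entry ρ_{kl} are linear combinations of the four outcome probabilities.

```python
import numpy as np

def entry_from_tomography(rho, k, l):
    """Recover rho[k, l] from the probabilities of the four projectors
    E1 = |k><k|, E2 = |l><l|, E3 onto (|k>+|l>)/sqrt(2), E4 onto (|k>+i|l>)/sqrt(2)."""
    n = rho.shape[0]
    ek, el = np.eye(n)[k], np.eye(n)[l]
    plus = (ek + el) / np.sqrt(2)
    plus_i = (ek + 1j * el) / np.sqrt(2)
    p1 = np.real(ek @ rho @ ek)                  # Tr(rho E1)
    p2 = np.real(el @ rho @ el)                  # Tr(rho E2)
    p3 = np.real(plus.conj() @ rho @ plus)       # Tr(rho E3)
    p4 = np.real(plus_i.conj() @ rho @ plus_i)   # Tr(rho E4)
    return (p3 - (p1 + p2) / 2) + 1j * ((p1 + p2) / 2 - p4)

# A pure state rho = |v><v| on 3 dimensions.
v = np.array([0.6, 0.8j, 0.0])
rho = np.outer(v, v.conj())
print(entry_from_tomography(rho, 0, 1), rho[0, 1])   # should agree
```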

Page 46:

Inverting a matrix

Page 47:

Inverting a matrix whose eigenvalues are well separated.

• Approximate the eigenvalues: D = Diag(λ₁,…,λₙ).
• Approximate the eigenvectors v₁,…,vₙ: V = (v₁,…,vₙ).

Then,

A ≈ V D V†

A⁻¹ ≈ V D⁻¹ V†
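
A numpy sketch of this final step (our own illustration, using exact eigendata in place of the approximations produced by the quantum procedure):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])            # Hermitian, well conditioned

eigvals, V = np.linalg.eigh(A)             # stand-ins for the approximated spectrum
A_rec = V @ np.diag(eigvals) @ V.conj().T          # A    ~ V D V^dagger
A_inv = V @ np.diag(1.0 / eigvals) @ V.conj().T    # A^-1 ~ V D^-1 V^dagger

print(np.linalg.norm(A_rec - A))
print(np.linalg.norm(A_inv - np.linalg.inv(A)))
```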

Page 48:

Some reflections

Page 49:

BQL is surprisingly powerful. Either:
• BQL is indeed stronger than BPL, or
• BPL is also surprisingly powerful.

Reingold showed USTCON ∈ L.

So far, this has not been extended to RL = L.

An intriguing question: Can one approximately invert stochastic matrices in BPL?