Page 1: Physics I

Entropy: Reversibility, Disorder, and Information

Prof. WAN, Xin
xinwan@zju.edu.cn
http://zimp.zju.edu.cn/~xinwan/

Page 2: 1st & 2nd Laws of Thermodynamics

The 1st law specifies that we cannot get more energy out of a cyclic process by work than the amount of energy we put in.

The 2nd law states that we cannot break even because we must put more energy in, at the higher temperature, than the net amount of energy we get out by work.

\Delta U = Q - W

e_{\rm Carnot} = \frac{W}{Q_h} = 1 - \frac{Q_c}{Q_h} = 1 - \frac{T_c}{T_h}

Page 3: Carnot's Engine

Page 4: Efficiency of a Carnot Engine

All Carnot engines operating between the same two temperatures have the same efficiency.
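As a quick numerical illustration (not from the slides; the reservoir temperatures and Q_h below are assumed), the following Python sketch evaluates the Carnot efficiency and checks the first-law balance W = Q_h - Q_c:

# Hypothetical numbers, chosen only for illustration.
T_h = 500.0   # hot reservoir temperature (K)
T_c = 300.0   # cold reservoir temperature (K)
Q_h = 1000.0  # heat absorbed from the hot reservoir per cycle (J)

e_carnot = 1.0 - T_c / T_h    # e = 1 - Tc/Th
W = e_carnot * Q_h            # work output per cycle, W = e * Qh
Q_c = Q_h - W                 # heat rejected (Delta U = 0 over a cycle)

print(f"Carnot efficiency e = {e_carnot:.3f}")                      # 0.400
print(f"Work W = {W:.1f} J, heat rejected Q_c = {Q_c:.1f} J")
print(f"Check: Q_c/Q_h = {Q_c/Q_h:.3f} equals T_c/T_h = {T_c/T_h:.3f}")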

Page 5: An Equality

Now putting in the proper signs,

\frac{Q_h}{T_h} + \frac{Q_c}{T_c} = 0 \qquad (Q_h > 0 \text{ positive},\ Q_c < 0 \text{ negative})

\oint_{\rm Carnot\ cycle} \frac{dQ}{T} = 0
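A short numerical check (illustrative parameters only, assumed here) that the equality holds for an ideal-gas Carnot cycle, using Q = nRT ln(V_f/V_i) on the isotherms and T V^(gamma-1) = const on the adiabats:

import math

# Ideal-gas Carnot cycle with assumed, illustrative parameters.
n, R = 1.0, 8.314          # mol, J/(mol K)
gamma = 5.0 / 3.0          # monatomic ideal gas
T_h, T_c = 500.0, 300.0    # reservoir temperatures (K)
V_a, V_b = 1.0e-3, 3.0e-3  # volumes at the ends of the hot isotherm (m^3)

# Adiabats T V^(gamma-1) = const connect the two isotherms.
V_c = V_b * (T_h / T_c) ** (1.0 / (gamma - 1.0))
V_d = V_a * (T_h / T_c) ** (1.0 / (gamma - 1.0))

# Heat exchanged on the isotherms (Delta U = 0 there, so Q = W = nRT ln(Vf/Vi)).
Q_h = n * R * T_h * math.log(V_b / V_a)   # > 0, absorbed at T_h
Q_c = n * R * T_c * math.log(V_d / V_c)   # < 0, rejected at T_c

print(Q_h / T_h + Q_c / T_c)   # ~ 0, up to floating-point rounding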

Page 6: A Sum of Carnot Cycles

[Figure: a reversible cycle in the P-V plane, sliced by adiabats into narrow Carnot cycles operating between temperatures T_{h,i} and T_{c,i}.]

Any reversible process can be approximated by a sum of Carnot cycles, hence

\sum_i \left( \frac{Q_{h,i}}{T_{h,i}} + \frac{Q_{c,i}}{T_{c,i}} \right) = 0
\quad\Longrightarrow\quad
\oint_C \frac{dQ}{T} = 0
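The same statement can be checked numerically for a reversible cycle that is not a Carnot cycle. The sketch below (an assumed rectangular cycle in the P-V plane for a monatomic ideal gas; the numbers are mine, not from the slides) sums dQ/T around the loop:

import numpy as np

# Numerical check that the cycle integral of dQ/T vanishes for a reversible
# ideal-gas cycle, here a rectangular loop in the P-V plane.
n, R = 1.0, 8.314
Cv = 1.5 * R                      # monatomic ideal gas

def leg(P_i, V_i, P_f, V_f, steps=100000):
    """Sum dQ/T along a straight segment in the P-V plane."""
    P = np.linspace(P_i, P_f, steps + 1)
    V = np.linspace(V_i, V_f, steps + 1)
    T = P * V / (n * R)                        # ideal-gas temperature along the leg
    dU = n * Cv * np.diff(T)
    dW = 0.5 * (P[:-1] + P[1:]) * np.diff(V)   # P dV with midpoint pressure
    dQ = dU + dW
    T_mid = 0.5 * (T[:-1] + T[1:])
    return np.sum(dQ / T_mid)

# Corners of the rectangle as (P, V), traversed in order and closed.
corners = [(2.0e5, 1.0e-3), (2.0e5, 3.0e-3), (1.0e5, 3.0e-3), (1.0e5, 1.0e-3)]
total = sum(leg(*corners[i], *corners[(i + 1) % 4]) for i in range(4))
print(total)   # ~ 0 around the closed reversible cycle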

Page 7: Clausius Definition of Entropy

Entropy is a state function: the change in entropy during a process depends only on the end points and is independent of the actual path followed.

dS \equiv \frac{dQ_{\rm reversible}}{T}

\oint dS = \int_{1 \to 2\ (C_1)} dS + \int_{2 \to 1\ (C_2)} dS = 0
\quad\Longrightarrow\quad
S_2 - S_1 = \int_{1 \to 2\ (C_1)} dS = \int_{1 \to 2\ (C_2)} dS

[Figure: two reversible paths C_1 and C_2 connecting states 1 and 2.]

Page 8: Return to Inexact Differential

Assume

dg = dx + \frac{x}{y}\, dy

and integrate it from (1,1) to (2,2) along two different paths:

\int_{(1,1)}^{(1,2)} dg + \int_{(1,2)}^{(2,2)} dg = \ln 2 + 1

\int_{(1,1)}^{(2,1)} dg + \int_{(2,1)}^{(2,2)} dg = 1 + 2\ln 2

The two results differ, so dg is an inexact differential. Note that with the integrating factor 1/x,

df = \frac{dg}{x} = \frac{dx}{x} + \frac{dy}{y}

is an exact differential, with

f(x, y) = \ln x + \ln y + f_0
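A numerical sketch of the same calculation (the path-integration helper below is illustrative, not from the slides): integrating dg along the two rectangular paths gives different answers, while df = dg/x does not care about the path.

import numpy as np

# Integrate coef_x(x,y) dx + coef_y(x,y) dy along straight segments.
def path_integral(coef_x, coef_y, points, steps=100000):
    total = 0.0
    for (x0, y0), (x1, y1) in zip(points[:-1], points[1:]):
        t = np.linspace(0.0, 1.0, steps + 1)
        x = x0 + (x1 - x0) * t
        y = y0 + (y1 - y0) * t
        xm, ym = 0.5 * (x[:-1] + x[1:]), 0.5 * (y[:-1] + y[1:])
        total += np.sum(coef_x(xm, ym) * np.diff(x) + coef_y(xm, ym) * np.diff(y))
    return total

dg = (lambda x, y: 1.0,     lambda x, y: x / y)     # dg = dx + (x/y) dy
df = (lambda x, y: 1.0 / x, lambda x, y: 1.0 / y)   # df = dx/x + dy/y

path_A = [(1, 1), (1, 2), (2, 2)]   # up, then across
path_B = [(1, 1), (2, 1), (2, 2)]   # across, then up

print(path_integral(*dg, path_A), path_integral(*dg, path_B))  # ~1.693 vs ~2.386
print(path_integral(*df, path_A), path_integral(*df, path_B))  # both ~1.386 = ln 4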

Page 9: Digression on Multivariate Calculus

Heat is path dependent.

Therefore, 1/T is really the integrating factor for the differential form of heat. Now we can recast the 1st law of thermodynamics as

Entropy is also a state function, as is the internal energy or volume.

dQ = dU + P\, dV \quad\Longrightarrow\quad dU = T\, dS - P\, dV

Page 10: Entropy of an Ideal Gas (1 mole)

U(T) = C_V^{\rm mol}\, T = \frac{f}{2} R\, T, \qquad p(T, V) = \frac{RT}{V}

T\, dS = dU + p\, dV \quad\Longrightarrow\quad dS = C_V^{\rm mol}\, \frac{dT}{T} + R\, \frac{dV}{V}

Integrating from (T_0, V_0) to (T, V):

S(T, V) = S_0 + C_V^{\rm mol} \ln\frac{T}{T_0} + R \ln\frac{V}{V_0}
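To see the integrating-factor idea at work numerically, the sketch below (assumed end states for 1 mol of a monatomic ideal gas) compares two reversible paths from (T0, V0) to (T1, V1): the heat absorbed differs, but the accumulated integral of dQ/T is the same.

import math

# Assumed illustrative numbers for 1 mol of a monatomic ideal gas.
R, Cv = 8.314, 1.5 * 8.314
T0, V0 = 300.0, 1.0e-3
T1, V1 = 450.0, 2.5e-3

# Path A: heat at constant volume V0, then expand isothermally at T1.
Q_A  = Cv * (T1 - T0) + R * T1 * math.log(V1 / V0)
dS_A = Cv * math.log(T1 / T0) + (R * T1 * math.log(V1 / V0)) / T1   # sum of int dQ/T per leg

# Path B: expand isothermally at T0, then heat at constant volume V1.
Q_B  = R * T0 * math.log(V1 / V0) + Cv * (T1 - T0)
dS_B = (R * T0 * math.log(V1 / V0)) / T0 + Cv * math.log(T1 / T0)

print(f"Q_A = {Q_A:.1f} J, Q_B = {Q_B:.1f} J")                 # heat is path dependent
print(f"dS_A = {dS_A:.3f} J/K, dS_B = {dS_B:.3f} J/K")         # entropy change is not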

Page 11: Carnot's Theorem

No real heat engine operating between two energy reservoirs can be more efficient than Carnot’s engine operating between the same two reservoirs.

e' = 1 - \frac{|Q'_c|}{Q'_h} \;\le\; 1 - \frac{T_c}{T_h}

Putting in the proper signs (Q'_h > 0 positive, Q'_c < 0 negative),

\frac{Q'_h}{T_h} + \frac{Q'_c}{T_c} \le 0

What does this mean? Still, for any engine in a cycle (S is a state function!),

\oint dS = 0

Page 12: Counting the Heat Baths In

After a cycle (Q'_h > 0, Q'_c < 0):

\Delta S_{\rm gas} = \oint dS = 0, \qquad
\Delta S_h = -\frac{Q'_h}{T_h}, \qquad
\Delta S_c = -\frac{Q'_c}{T_c}

\Delta S = \Delta S_{\rm gas} + \Delta S_h + \Delta S_c = -\frac{Q'_h}{T_h} - \frac{Q'_c}{T_c} \ge 0
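A numerical version of this bookkeeping, with assumed temperatures, heat input, and a sub-Carnot efficiency (none of these numbers come from the slides):

# Illustrative bookkeeping: a real engine between T_h and T_c plus its two heat baths.
T_h, T_c = 500.0, 300.0
Q_h = 1000.0                 # heat drawn from the hot bath per cycle (> 0)
e_real = 0.25                # assumed; e_Carnot = 1 - T_c/T_h = 0.40
W = e_real * Q_h
Q_c = -(Q_h - W)             # heat given to the gas by the cold bath (< 0)

dS_gas  = 0.0                # S is a state function; the gas returns to its initial state
dS_hot  = -Q_h / T_h         # hot bath loses Q_h
dS_cold = -Q_c / T_c         # cold bath gains |Q_c|
dS_total = dS_gas + dS_hot + dS_cold

print(f"Delta S_total = {dS_total:.3f} J/K per cycle")   # > 0 for e_real < e_Carnot, = 0 at equality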

Page 13: Counting the Heat Baths In


The total entropy of an isolated system that undergoes a change can never decrease.

Page 14: Example 1: Clausius Statement

Heat Q > 0 is conducted directly from the hot bath at T_h to the cold bath at T_c < T_h:

\Delta S_h = -\frac{Q}{T_h}, \qquad \Delta S_c = +\frac{Q}{T_c}

\Delta S = \Delta S_h + \Delta S_c = \frac{Q}{T_c} - \frac{Q}{T_h} > 0

Irreversible!
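For instance, with assumed values Q = 1000 J, T_h = 500 K, T_c = 300 K:

# Assumed numbers: 1000 J conducted from a 500 K bath to a 300 K bath.
Q, T_h, T_c = 1000.0, 500.0, 300.0
dS = Q / T_c - Q / T_h
print(f"Delta S = {dS:.2f} J/K")   # +1.33 J/K > 0: the heat flow is irreversible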

Page 15: Example 2: Kelvin Statement

Work is converted entirely into heat Q delivered to a single bath at temperature T:

\Delta S = \frac{Q}{T} > 0

Irreversible! (The reverse process, converting heat from a single bath entirely into work, would give \Delta S = -Q/T < 0, which is what the Kelvin statement forbids.)

Page 16: Example 3: Mixing Water

[Figure: water A at T_A and water B at T_B, with T_A < T_B, exchange heat Q and reach a common final temperature T.]

T_A \to T: \quad Q = m_A c\,(T - T_A), \qquad T_B \to T: \quad Q = m_B c\,(T_B - T)

\Longrightarrow\quad
T = \frac{m_A T_A + m_B T_B}{m_A + m_B}, \qquad
\frac{m_A}{m_B} = \frac{T_B - T}{T - T_A}

Page 17: Example 3: Mixing Water

For simplicity, assume m_A = m_B = m, so that T = (T_A + T_B)/2.

T_A \to T: \quad \Delta S_A = \int_{T_A}^{T} \frac{m c\, dT}{T} = m c \ln\frac{T}{T_A} > 0

T_B \to T: \quad \Delta S_B = \int_{T_B}^{T} \frac{m c\, dT}{T} = m c \ln\frac{T}{T_B} < 0

\Delta S = \Delta S_A + \Delta S_B = 2 m c \ln\frac{T}{\sqrt{T_A T_B}} > 0,

since the arithmetic mean (T_A + T_B)/2 exceeds the geometric mean \sqrt{T_A T_B}.

Irreversible!
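With assumed, illustrative numbers (1 kg of water on each side, c = 4186 J/(kg K)):

import math

# Mix 1.0 kg of water at 280 K with 1.0 kg at 360 K (assumed values).
m, c = 1.0, 4186.0          # kg, J/(kg K)
T_A, T_B = 280.0, 360.0
T = 0.5 * (T_A + T_B)       # 320 K

dS_A = m * c * math.log(T / T_A)   # colder water warms up: > 0
dS_B = m * c * math.log(T / T_B)   # hotter water cools down: < 0
print(f"dS_A = {dS_A:+.1f} J/K, dS_B = {dS_B:+.1f} J/K, total = {dS_A + dS_B:+.1f} J/K")
# The gain always outweighs the loss, since (T_A + T_B)/2 > sqrt(T_A * T_B).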

Page 18: Example 4: Free Expansion

We can only calculate ΔS with a reversible process! In this case, we replace the free expansion by an isothermal process with the same initial and final states.

Free expansion: \quad Q = 0, \quad W = 0, \quad \Delta U = 0, \quad \Delta S = ?

Along the replacement reversible isothermal path (dU = 0, so dQ = P\,dV),

\Delta S = \int_i^f \frac{dQ}{T} = \int_i^f \frac{P\, dV}{T} = \int_{V_i}^{V_f} \frac{nR\, dV}{V} = nR \ln\frac{V_f}{V_i} > 0

Irreversible!
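For the standard doubling-of-volume case (V_f = 2 V_i, 1 mol assumed):

import math

# Assumed example: 1 mol of ideal gas freely expands to twice its volume.
n, R = 1.0, 8.314
dS_gas = n * R * math.log(2.0)     # entropy of the gas increases
dS_surroundings = 0.0              # no heat is exchanged with anything
print(f"dS_total = {dS_gas + dS_surroundings:.3f} J/K")   # ~ +5.763 J/K > 0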

Page 19: The Second Law in terms of Entropy

The total entropy of an isolated system that undergoes a change can never decrease.

– If the process is irreversible, then the total entropy of an isolated system always increases.

– In a reversible process, the total entropy of an isolated system remains constant.

The change in entropy of the Universe must be greater than zero for an irreversible process and equal to zero for a reversible process.

\Delta S_{\rm Universe} \ge 0

Page 20: Order versus Disorder

Isolated systems tend toward disorder, and entropy is a measure of this disorder.

Ordered: all molecules on the left side

Disordered: molecules on the left and right

Page 21: Macrostate versus Microstate

Each of the microstates is equally probable. An ordered microstate is expected to be very unlikely, because random motions tend to distribute molecules uniformly. There are many more disordered microstates than ordered microstates. A macrostate corresponding to a large number of equivalent disordered microstates is therefore much more probable than a macrostate corresponding to a small number of equivalent ordered microstates.

How much more probable?
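A rough way to quantify it (illustrative N = 100, far smaller than a real gas): count the microstates of the "all molecules on the left" macrostate against those of the "evenly split" macrostate.

from math import comb

# N molecules, each equally likely to sit in the left or right half of a box.
N = 100
W_ordered = 1                      # only one way to put all N on the left
W_disordered = comb(N, N // 2)     # ways to split them evenly

print(f"W_disordered / W_ordered = {W_disordered / W_ordered:.3e}")   # ~ 1e+29 for N = 100
# For a macroscopic N ~ 10^23 the ratio is astronomically larger still.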

Page 22: Entropy: A Measure of Disorder

We assume that each molecule occupies some microscopic volume V_m, so the number of ways to place N molecules is

W_i = \left( \frac{V_i}{V_m} \right)^N, \qquad
W_f = \left( \frac{V_f}{V_m} \right)^N, \qquad
\frac{W_f}{W_i} = \left( \frac{V_f}{V_i} \right)^N

Comparing with the free-expansion result (V_f = 2 V_i),

S_f - S_i = N k_B \ln\frac{V_f}{V_i} = N k_B \ln 2 = k_B \ln\frac{W_f}{W_i}

suggesting (Boltzmann)

S = k_B \ln W
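As a consistency check (assumed n = 1 mol), k_B ln(W_f/W_i) for the volume doubling reproduces the thermodynamic nR ln 2 found for the free expansion:

import math

k_B = 1.380649e-23       # J/K
N_A = 6.02214076e23      # molecules per mole
R   = k_B * N_A          # 8.314 J/(mol K)

dS_statistical = N_A * k_B * math.log(2.0)   # N k_B ln 2 with N = N_A molecules
dS_thermo      = R * math.log(2.0)           # n R ln(V_f/V_i) with n = 1, V_f = 2 V_i
print(dS_statistical, dS_thermo)             # both ~ 5.763 J/K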

Page 23: A Similar Probability Problem

Page 24: Let's Play Cards

Imagine shuffling a deck of playing cards:

– Systems have a natural tendency to become more and more disordered.

The reason disorder almost always increases is that disordered states hugely outnumber highly ordered states, so the system inevitably settles into one of the more disordered states.
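The numbers behind the card analogy are easy to generate (the snippet below just evaluates 52!):

import math

# A standard 52-card deck has 52! possible orderings, only one of which is
# the factory order. Shuffling essentially never recreates it.
orderings = math.factorial(52)
print(f"52! ~ {orderings:.2e}")   # ~ 8.07e67 possible orderings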

Page 25:

Computers are useless. They can only give us answers.

---- Pablo Picasso

Page 26: Information and Entropy

(1927) Bell Labs, Ralph Hartley
– Measure for information in a message
– Logarithm: 8 bits = 2^8 = 256 different numbers

(1948) Bell Labs, Claude Shannon
– "A Mathematical Theory of Communication"
– Probability of a particular message

But there is no information.

You are not winning the lottery.

Page 27: Information and Entropy

Okay, you are going to win the lottery.

Now that's something.

Page 28: Information and Entropy

– Information ~ -log(probability) ~ negative entropy

S_{\rm information} = -\sum_i P_i \log P_i
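A minimal sketch of Shannon's measure (the function name and example probabilities are mine, not from the slides): a nearly certain message carries almost no information, while a fair coin flip carries one bit.

import math

def shannon_entropy(probs):
    """S = -sum_i P_i log2 P_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

print(shannon_entropy([0.999999, 0.000001]))  # ~ 2.1e-05 bits: "you are not winning"
print(shannon_entropy([0.5, 0.5]))            # 1.0 bit: maximally uncertain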

Page 29:

It is already in use under that name. … and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway.

---- John von Neumann

Page 30: Maxwell's Demon

To determine whether to let a molecule through, the demon must acquire information about the state of the molecule. However well prepared, the demon will eventually run out of information storage space and must begin to erase the information it has previously gathered. Erasing information is a thermodynamically irreversible process that increases the entropy of a system.

Page 31: Landauer's Principle & Verification

Computation needs to involve heat dissipation only when you do something irreversible with the information.

Lutz group (2012)

\frac{Q}{k_B T} \ge \ln 2 \approx 0.693
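Plugging in numbers (room temperature T = 300 K assumed), the bound per erased bit is tiny on everyday scales:

import math

# Landauer bound: minimum heat dissipated to erase one bit at temperature T.
k_B = 1.380649e-23      # J/K
T = 300.0               # K, assumed room temperature
Q_min = k_B * T * math.log(2)
print(f"k_B T ln2 = {Q_min:.2e} J per erased bit")   # ~ 2.87e-21 J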

Page 32: Homework (for the 2nd Law)

CHAP. 24 Exercises 25, 30, 35 (P565) 4, 8 (P566)

Page 33: Homework

Reading (downloadable from my website):

– Charles Bennett and Rolf Landauer, The fundamental physical limits of computation, Scientific American (1985).

– Antoine Bérut et al., Experimental verification of Landauer's principle linking information and thermodynamics, Nature (2012).

– Seth Lloyd, Ultimate physical limits to computation, Nature (2000).

Dare to adventure where you have not been!