Department of Computer Sc. & Engg.

Seminar Report

Topic: Reversible Logic Gates & Circuits
Presented on: 26th February, 2007

Presented by:
Indranil Nandy
MTech (CS), 2007
Roll: 06CS6010
E-mail: [email protected]

Under the guidance of:
Prof. Indranil Sen Gupta
Head, School of Information Technology
Indian Institute of Technology
Contents

1. Introductory Chapter
   Introduction
   How It All Came About

2. Reversible Gates and Circuits: Detailed Analysis
   Background
   Definitions
     How to represent a reversible circuit truth table?
     How to encode a reversible circuit?
     Temporary Storage
     Representing Circuits by Graphs
   Discussion
   Some Special Types of Reversible Gates

3. A Family of Logical Fault Models for Reversible Circuits
   Introduction
   Trapped-Ion Technology
   Fault Models
     Single Missing-Gate Fault Model
     Repeated-Gate Fault Model
     Multiple Missing-Gate Fault Model
     Partial Missing-Gate Fault Model

4. Testable Reversible Gates
   Introduction
     Gate R
     Gates R1 & R2
   Reversible Gates with Built-in Testability
   Two-Pair Rail Checker
   Synthesis of the Reversible Logic Circuits
   CMOS Realization of the Proposed Reversible Logic Gates
   Estimation of Power

5. Reversible Memory Elements
   Introduction
   Addressing the Problem of Fan-out
   Constructing a New Reversible RS-Latch
   Reversible Clocked Flip-Flops
     Master-Slave Flip-Flop
     D Flip-Flop
     JK Flip-Flop
     T Flip-Flop

6. Quantum Search Applications
   What Is Quantum Computing?
   Quantum Computation & Reversible Computation
   Quantum Computing: Bits and Qubits
   Quantum Search

Appendix A
   Irreversibility and Heat Generation
Chapter 1: Introductory Chapter

Introduction

In most computing tasks, the number of output bits is relatively small compared to the number of input bits. For example, in a decision problem the output is only one bit (yes or no), while the input can be as large as desired. However, computational tasks in digital signal processing, communication, computer graphics, and cryptography require that all of the information encoded in the input be preserved in the output. Some of these tasks are important enough to justify adding new microprocessor instructions to the HP PA-RISC (MAX and MAX-2), Sun SPARC (VIS), PowerPC (AltiVec), and IA-32/IA-64 (MMX) instruction sets. In particular, new bit-permutation instructions were shown to vastly improve the performance of several standard algorithms, including matrix transposition and DES, as well as the two more recent cryptographic algorithms Twofish and Serpent.

Bit permutations are a special case of reversible functions, that is, functions that permute the set of possible input values. For example, the butterfly operation (x, y) → (x + y, x − y) is reversible but is not a bit permutation. It is a key element of Fast Fourier Transform algorithms and has been used in application-specific Xtensa processors from Tensilica. One might expect to obtain further speed-ups by adding instructions that compute arbitrary reversible functions. The problem of chaining such instructions together provides one motivation for studying reversible computation and reversible logic circuits, that is, logic circuits composed of gates computing reversible functions.

Reversible circuits are also interesting because the loss of information associated with irreversibility implies energy loss. Younis and Knight showed that some reversible circuits can be made asymptotically energy-lossless as their delay is allowed to grow arbitrarily large.
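The reversibility of the butterfly can be checked directly: writing s = x + y and d = x − y, the inputs are recovered as x = (s + d)/2 and y = (s − d)/2. A minimal Python sketch (function names are illustrative, not from the report):

```python
def butterfly(x, y):
    # forward butterfly: (x, y) -> (x + y, x - y)
    return x + y, x - y

def inverse_butterfly(s, d):
    # recover the inputs: x = (s + d)/2, y = (s - d)/2
    # (s + d and s - d are always even for integer x, y)
    return (s + d) // 2, (s - d) // 2

x, y = 7, 3
s, d = butterfly(x, y)
assert inverse_butterfly(s, d) == (x, y)   # information is preserved
```

By contrast, an irreversible operation such as (x, y) → x + y discards information: different input pairs collapse onto the same output, so no inverse exists.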
[Excerpt from "Asymptotically Zero Energy Split-Level Charge Recovery Logic" (Younis & Knight): Power dissipation in conventional CMOS primarily occurs during device switching. One component of this dissipation is due to charging and discharging the gate capacitances through conducting, but slightly resistive, devices. We note here that it is not the charging or the discharging of the gate that is necessarily dissipative, but rather that a small time is allocated to perform these operations. In conventional CMOS, the time constant associated with charging the gate through a similar transistor is RC, where R is the ON resistance of the device and C its capacitance. However, the cycle time can be, and usually is, much longer than RC. An obvious conclusion is that energy consumption can be reduced by spreading the transitions over the whole cycle rather than "squeezing" them all inside one RC. To successfully spread the transition over periods longer than RC, we insist that two conditions apply throughout the operation of our circuit. Firstly, we forbid any device in our circuit from turning ON while a potential difference exists across it. Secondly, once the device is switched ON, the energy transfer through the device occurs in a controlled and gradual manner to prevent a potential from developing across it. These conditions place some interesting restrictions on the way we usually perform computations. To perform a non-dissipative transition of the output, we must know the state of the output prior to and during this output transition. Stated more clearly, to non-dissipatively reset the state of the output we must at all times have a copy of it. The only way out of this circle is to use reversible logic. It is this observation that is the core of our low energy charge recovery logic.]
Currently, energy losses due to irreversibility are dwarfed by overall power dissipation, but this may change as power dissipation in conventional circuits improves. Reversibility is particularly important for nanotechnologies in which switching devices with gain are difficult to build.

Finally, reversible circuits can be viewed as a special case of quantum circuits, because quantum evolution must be reversible. Classical (non-quantum) reversible gates obey the same circuit rules whether they operate on classical bits or quantum states, and popular universal gate libraries for quantum computation often contain universal gate libraries for classical reversible computation as subsets. While the speed-ups that make quantum computing attractive require purely quantum gates, logic synthesis for classical reversible circuits is a first step toward the synthesis of quantum circuits, which demand complete reversibility. Moreover, algorithms for quantum communication and cryptography often have no classical counterparts because they act on quantum states, even if their action in a given computational basis corresponds to classical reversible functions on bit-strings. Variants of conventional reversible gates are also commonly used in quantum algorithms: for example, the textbook implementation of Grover's quantum search algorithm uses many NCT (NOT, CNOT, and TOFFOLI) gates. Hence, efficient synthesis with such gates is an important step toward quantum computation. Toffoli showed that the NCT gate library is universal for the synthesis of reversible Boolean circuits. This result has recently been extended to show that every even permutation can be synthesized with no temporary storage lines, while odd permutations require exactly one extra line.
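The three NCT gates can be modelled as operations on an n-bit state (encoded as an integer). Each one is a bijection on the 2^n basis states, and each is its own inverse, which is easy to verify exhaustively on 3 bits. A small sketch (encoding and helper names are our own, not from the report):

```python
def NOT(state, t):
    # unconditionally flip target bit t
    return state ^ (1 << t)

def CNOT(state, c, t):
    # flip target bit t iff control bit c is 1
    return state ^ (1 << t) if state >> c & 1 else state

def TOFFOLI(state, c1, c2, t):
    # flip target bit t iff both control bits are 1
    return state ^ (1 << t) if (state >> c1 & 1) and (state >> c2 & 1) else state

# Each gate permutes the 2^3 = 8 basis states of a 3-bit register:
states = list(range(8))
for gate in (lambda s: NOT(s, 0),
             lambda s: CNOT(s, 0, 1),
             lambda s: TOFFOLI(s, 0, 1, 2)):
    assert sorted(gate(s) for s in states) == states        # bijective, hence reversible
    assert all(gate(gate(s)) == s for s in states)          # each NCT gate is self-inverse
```

Because every gate is a permutation, any circuit built from them — any composition — is itself a permutation, which is exactly the statement that NCT circuits compute reversible functions.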
Optimal circuits for all three-bit reversible functions can be found in several minutes by dynamic programming. The same algorithm synthesizes optimal four-bit circuits reasonably quickly, but does not scale much further. More scalable constructive synthesis algorithms tend to produce suboptimal circuits even on three bits, which suggests following them with iterative optimization based on local search.
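To make the idea of exhaustive optimal synthesis concrete, here is a toy breadth-first search over 2-bit circuits built from NOT and CNOT gates. This is a sketch of the general approach (search outward from the identity, recording the shortest gate sequence reaching each permutation), not the dynamic-programming algorithm the text refers to; all names are our own:

```python
from collections import deque

# Gate library on 2 wires: NOT on each wire, CNOT in both directions.
GATES = [("NOT", 0, None), ("NOT", 1, None), ("CNOT", 0, 1), ("CNOT", 1, 0)]

def apply(gate, state):
    kind, a, b = gate
    if kind == "NOT":
        return state ^ (1 << a)
    # CNOT: flip target bit b iff control bit a is set
    return state ^ (1 << b) if state >> a & 1 else state

def synthesize(target):
    """BFS from the identity permutation; returns a shortest gate
    sequence realizing `target`, given as a tuple (f(0), f(1), f(2), f(3))."""
    start = (0, 1, 2, 3)
    seen = {start: []}
    queue = deque([start])
    while queue:
        perm = queue.popleft()
        if perm == target:
            return seen[perm]
        for g in GATES:
            nxt = tuple(apply(g, s) for s in perm)
            if nxt not in seen:
                seen[nxt] = seen[perm] + [g]
                queue.append(nxt)
    return None  # unreachable with this library

# Example: swapping the two wires (states 1 and 2 exchange) needs three CNOTs.
swap = (0, 2, 1, 3)
```

For instance, `len(synthesize(swap))` is 3, matching the well-known three-CNOT realization of SWAP. The state space explodes as (2^n)! for n wires, which is why such exhaustive methods stop being practical beyond about four bits.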
How It All Came About

Question: What difficulties arise when we try to build classical computers (Turing machines) on the atomic scale?

Answer: One of the toughest problems in scaling down computers is the dissipated heat, which is difficult to remove. The physical limitations placed on computation by heat dissipation have been studied for many years. The usual digital computer program frequently performs operations that seem to throw away information about the computer's history, leaving the machine in a state whose immediate predecessor is ambiguous. Such operations include erasure or overwriting of data, and entry into a portion of the program addressed by several different transfer instructions. In other words, the typical computer is logically irreversible: its transition function (the partial function that maps each whole-machine state onto its successor, if the state has a successor) lacks a single-valued inverse. Landauer [3] posed the question of whether logical irreversibility is an unavoidable feature of useful computers, arguing that it is, and demonstrated the physical and philosophical importance of this question by showing that whenever a physical computer throws away information about its previous state it must generate a corresponding amount of entropy. Therefore, a computer must dissipate at least kT ln 2 of energy (about 3 × 10⁻²¹ joule at room temperature) for each bit of information it erases or otherwise throws away.
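The Landauer bound quoted above is a one-line computation: with Boltzmann's constant k ≈ 1.38 × 10⁻²³ J/K and room temperature taken as T = 300 K (an assumed value; "room temperature" is not pinned down in the text),

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0                     # room temperature in kelvin (assumed)

E_bit = k_B * T * math.log(2)  # minimum dissipation per erased bit, kT ln 2
print(f"{E_bit:.2e} J")        # ≈ 2.87e-21 J, i.e. "about 3 × 10⁻²¹ joule"
```

This is the per-bit floor for logically irreversible operations; a logically reversible computer is not subject to it, which is the thread the rest of the report follows.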
In his classic 1961 paper [3, Appendix A], Rolf Landauer attempted to apply thermodynamic reasoning to digital computers. Paralleling the fruitful distinction in statistical physics between macroscopic and microscopic degrees of freedom, he noted that some of a computer's degrees of freedom are used to encode the logical state of the computation, and these information-bearing degrees of freedom (IBDF) are by design sufficiently robust that, within limits, the computer's logical (i.e. digital) state evolves deterministically as a function