Theory and Modeling in Nanoscience: Report of the May 10–11, 2002, Workshop Conducted by the Basic Energy Sciences and Advanced Scientific Computing Advisory Committees to the Office of Science, Department of Energy

  • Cover illustrations:

    TOP LEFT: Ordered lubricants confined to nanoscale gap (Peter Cummings).

    BOTTOM LEFT: Hypothetical spintronic quantum computer (Sankar Das Sarma and Bruce Kane).

    TOP RIGHT: Folded spectrum method for free-standing quantum dot (Alex Zunger).

    MIDDLE RIGHT: Equilibrium structures of bare and chemically modified gold nanowires (Uzi Landman).

    BOTTOM RIGHT: Organic oligomers attracted to the surface of a quantum dot (F. W. Starr and S. C. Glotzer).

  • Theory and Modeling in Nanoscience

    Report of the May 10–11, 2002, Workshop Conducted by the Basic Energy Sciences and Advanced Scientific Computing Advisory Committees to the Office of Science, Department of Energy

    Organizing Committee

    C. William McCurdy
    Co-Chair and BESAC Representative
    Lawrence Berkeley National Laboratory

    Berkeley, CA 94720

    Ellen Stechel
    Co-Chair and ASCAC Representative

    Ford Motor Company
    Dearborn, MI 48121

    Peter Cummings
    The University of Tennessee

    Knoxville, TN 37996

    Bruce Hendrickson
    Sandia National Laboratories

    Albuquerque, NM 87185

    David Keyes
    Old Dominion University

    Norfolk, VA 23529

    This work was supported by the Director, Office of Science, Office of Basic Energy Sciences and Office of Advanced Scientific Computing Research, of the U.S. Department of Energy.

  • Table of Contents

    Executive Summary

    I. Introduction
       A. The Purpose of the Workshop
       B. Parallel Dramatic Advances in Experiment and Theory
       C. The Central Challenge
       D. Key Barriers to Progress in Theory and Modeling in Nanoscience
       E. Consensus Observations
       F. Opportunity for an Expanded Role for the Department of Energy
       G. Need for Computational Resources and Readiness to Use Them
       H. Summary of Specific Challenges and Opportunities
       Giant Magnetoresistance in Magnetic Storage

    II. Theory, Modeling, and Simulation in Nanoscience
       A. Nano Building Blocks
          Transport in Nanostructures: Electronic Devices
          Optical Properties on the Nanoscale: Optoelectronic Devices
          Coherence/Decoherence Tunneling: Quantum Computing
          Soft/Hard Matter Interfaces: Biosensors
          Spintronics: Information Technology
          Implications for Theory and Modeling
       B. Complex Nanostructures and Interfaces
       C. Dynamics, Assembly, and Growth of Nanostructures

    III. The Role of Applied Mathematics and Computer Science in Nanoscience
       A. Bridging Time and Length Scales
       B. Fast Algorithms
          Linear Algebra in Electronic Structure Calculations
          Monte Carlo Techniques
          Data Exploration and Visualization
          Computational Geometry
       C. Optimization and Predictability
          Optimization
          Predictability
          Software

    Appendix A: Workshop Agenda

    Appendix B: Workshop Participants

  • Executive Summary

    On May 10 and 11, 2002, a workshop entitled "Theory and Modeling in Nanoscience" was held in San Francisco to identify challenges and opportunities for theory, modeling, and simulation in nanoscience and nanotechnology and to investigate the growing and promising role of applied mathematics and computer science in meeting those challenges. A broad selection of university and national laboratory scientists contributed to the workshop, which included scientific presentations, a panel discussion, breakout sessions, and short white papers.

    Revolutionary New Capabilities in Theory, Modeling, and Simulation

    During the past 15 years, the fundamental techniques of theory, modeling, and simulation have undergone a revolution that parallels the extraordinary experimental advances on which the new field of nanoscience is based. This period has seen the development of density functional algorithms, quantum Monte Carlo techniques, ab initio molecular dynamics, advances in classical Monte Carlo methods and mesoscale methods for soft matter, and fast-multipole and multigrid algorithms. Dramatic new insights have come from the application of these and other new theoretical capabilities. Simultaneously, advances in computing hardware increased computing power by four orders of magnitude. The combination of new theoretical methods together with increased computing power has made it possible to simulate systems with millions of degrees of freedom.

    Unmistakable Promise of Theory, Modeling, and Simulation

    The application of new and extraordinary experimental tools to nanosystems has created an urgent need for a quantitative understanding of matter at the nanoscale. The absence of quantitative models that describe newly observed phenomena increasingly limits progress in the field. A clear consensus emerged at the workshop that without new, robust tools and models for the quantitative description of structure and dynamics at the nanoscale, the research community would miss important scientific opportunities in nanoscience. The absence of such tools would also seriously inhibit widespread applications in fields of nanotechnology ranging from molecular electronics to biomolecular materials. To realize the unmistakable promise of theory, modeling, and simulation in overcoming fundamental challenges in nanoscience requires new human and computer resources.

    Fundamental Challenges and Opportunities

    With each fundamental intellectual and computational challenge that must be met in nanoscience come opportunities for research and discovery utilizing the approaches of theory, modeling, and simulation. In the broad topical areas of (1) nano building blocks (nanotubes, quantum dots, clusters, and nanoparticles), (2) complex nanostructures and nano-interfaces, and (3) the assembly and growth of nanostructures, the workshop identified a large number of theory, modeling, and simulation challenges and opportunities. Among them are:

    to bridge electronic through macroscopic length and time scales

    to determine the essential science of transport mechanisms at the nanoscale

    to devise theoretical and simulation approaches to study nano-interfaces, which dominate nanoscale systems and are necessarily highly complex and heterogeneous

    to simulate with reasonable accuracy the optical properties of nanoscale structures and to model nanoscale optoelectronic devices

    to simulate complex nanostructures involving "soft" biologically or organically based structures and "hard" inorganic ones, as well as nano-interfaces between hard and soft matter

    to simulate self-assembly and directed self-assembly

    to devise theoretical and simulation approaches to quantum coherence, decoherence, and spintronics

    to develop self-validating and benchmarking methods

    The Role of Applied Mathematics

    Since mathematics is the language in which theory is expressed and advanced, developments in applied mathematics are central to the success of theory, modeling, and simulation for nanoscience, and the workshop identified important roles for new applied mathematics in the above-mentioned challenges. Novel applied mathematics is required to formulate new theory and to develop new computational algorithms applicable to complex systems at the nanoscale.

    The discussion of applied mathematics at the workshop focused on three areas that are directly relevant to the central challenges of theory, modeling, and simulation in nanoscience: (1) bridging time and length scales, (2) fast algorithms, and (3) optimization and predictability. Each of these broad areas has a recent track record of developments from the applied mathematics community. Recent advances range from fundamental approaches, like mathematical homogenization (whereby reliable coarse-scale results are made possible without detailed knowledge of finer scales), to new numerical algorithms, like the fast-multipole methods that make very large scale molecular dynamics calculations possible. Some of the mathematics of likely interest (perhaps the most important mathematics of interest) is not fully knowable at present, but it is clear that collaborative efforts between scientists in nanoscience and applied mathematicians can yield significant advances central to a successful national nanoscience initiative.
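    The homogenization idea mentioned above has a classic one-dimensional illustration, added here for concreteness (a standard textbook result, not taken from the report): for diffusion through a medium whose coefficient oscillates on a fine scale ε, the correct coarse-scale coefficient is the harmonic mean of the fine-scale one, not the arithmetic mean.

```latex
% Fine-scale problem with a 1-periodic coefficient a(y) > 0,
% oscillating on the small scale \varepsilon:
-\frac{d}{dx}\left( a\!\left(\frac{x}{\varepsilon}\right)\frac{du_\varepsilon}{dx} \right) = f(x).
% As \varepsilon \to 0, u_\varepsilon converges to the solution of a
% coarse-scale problem with a constant effective coefficient:
-a^{*}\,\frac{d^{2}u}{dx^{2}} = f(x),
\qquad
a^{*} = \left( \int_{0}^{1} \frac{dy}{a(y)} \right)^{-1}.
```

    The coarse equation is reliable even though it never resolves the ε-scale oscillations, which is exactly the property highlighted above.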

    The Opportunity for a New Investment

    The consensus of the workshop is that the country's investment in the national nanoscience initiative will pay greater scientific dividends if it is accelerated by a new investment in theory, modeling, and simulation in nanoscience. Such an investment can stimulate the formation of alliances and teams of experimentalists, theorists, applied mathematicians, and computer and computational scientists to meet the challenge of developing a broad quantitative understanding of structure and dynamics at the nanoscale.

    The Department of Energy is uniquely situated to build a successful program in theory, modeling, and simulation in nanoscience. Much of the nation's experimental work in nanoscience is already supported by the Department, and new facilities are being built at the DOE national laboratories. The Department also has an internationally regarded program in applied mathematics, and much of the foundational work on mathematical modeling and computation has emerged from DOE activities. Finally, the Department has unique resources and experience in high performance computing and algorithms. The combination of these areas of expertise makes the Department of Energy a natural home for nanoscience theory, modeling, and simulation.

  • I. Introduction

    A. The Purpose of the Workshop

    On May 10 and 11, 2002, a workshop entitled "Theory and Modeling in Nanoscience" was held in San Francisco, California, supported by the Offices of Basic Energy Sciences and Advanced Scientific Computing Research of the Department of Energy. The Basic Energy Sciences Advisory Committee and the Advanced Scientific Computing Advisory Committee convened the workshop to identify challenges and opportunities for theory, modeling, and simulation in nanoscience and nanotechnology, and additionally to investigate the growing and promising role of applied mathematics and computer science in meeting those challenges. The workshop agenda is reproduced in Appendix A.

    A broad selection of university and national laboratory scientists was invited, and about fifty were able to contribute to the workshop. The participants are listed in Appendix B. There were scientific presentations, a panel discussion, and breakout sessions, together with written contributions in the form of short white papers from those participants from the DOE labs. This report is the result of those contributions and the discussions at the workshop.

    This workshop report should be read in the context of other documents that define and support the National Nanotechnology Initiative. Those documents describe a broad range of applications that will benefit the principal missions of the Department of Energy, ranging from new materials and the energy efficiencies they make possible, to improved chemical and biological sensing. Key among those reports is the one from the Office of Basic Energy Sciences entitled Nanoscale Science, Engineering and Technology Research Directions (http://www.sc.doe.gov/production/bes/nanoscale.html), which points out the great need for theory and modeling. Other nanoscience documents from the Department of Energy can be found at http://www.er.doe.gov/production/bes/NNI.htm, and a number of reports from other agencies and groups are linked to the Web site of the National Nanotechnology Initiative, http://www.nano.gov/.

    B. Parallel Dramatic Advances in Experiment and Theory

    The context of the workshop was apparent at the outset. The rapid rise of the field of nanoscience is due to the appearance over the past 15 years of a collection of new experimental techniques that have made manipulation and construction of objects at the nanoscale possible. Indeed, the field has emerged from those new experimental techniques.

    Some of those experimental methods, such as scanning tunneling microscopy, created new capabilities for characterizing nanostructures. The invention of atomic force microscopy produced not only a tool for characterizing objects at the nanoscale but one for manipulating them as well. The array of experimental techniques for controlled fabrication of nanotubes and nanocrystals, together with methods to fabricate quantum dots and wells, produced an entirely new set of elementary nanostructures. Combinatorial chemistry and genetic techniques have opened the door to the synthesis

    of new biomolecular materials and the creation of nano-interfaces and nano-interconnects between hard and soft matter. Nanoscience arose from the entire ensemble of these and other new experimental techniques, which have created the building blocks of nanotechnology.

    Over the same 15-year period, the fundamental techniques of theory, modeling, and simulation that are relevant to matter at the nanoscale have undergone a revolution that has been no less stunning. The advances of computing hardware over this period are universally familiar, with computing power increasing by four orders of magnitude, as can be seen by comparing the Gordon Bell Prizes in 1988 (1 Gflop/s) and 2001 (11 Tflop/s). But as impressive as the increase in computing power has been, it is only part of the overall advance in theory, modeling, and simulation that has occurred over the same period. This has been the period in which:

    Density functional theory (DFT) transformed theoretical chemistry, surface science, and materials physics and has created a new ability to describe the electronic structure and interatomic forces in molecules with hundreds and sometimes thousands of atoms (Figure 1).

    Molecular dynamics with fast multipole methods for computing long-range interatomic forces has made accurate calculations possible on the dynamics of millions and sometimes billions of atoms.

    Monte Carlo methods for classical simulations have undergone a revolution, with the development of a range of techniques (e.g., parallel tempering, continuum configurational bias, and extended ensembles) that permit extraordinarily fast equilibration of systems with long relaxation times.

    New mesoscale methods (including dissipative particle dynamics and field-theoretic polymer simulation) have been developed for describing systems with long relaxation times and large spatial scales, and are proving useful for the rapid prototyping of nanostructures in multicomponent polymer blends.

    Quantum Monte Carlo methods now promise to provide nearly exact descriptions of the electronic structures of molecules.

    The Car-Parrinello method for ab initio molecular dynamics with simultaneous computation of electronic wavefunctions and interatomic forces has opened the way for exploring the dynamics of molecules in condensed media as well as complex interfaces.

    The tools of theory have advanced as much as the experimental tools in nanoscience over the past 15 years. It has been a true revolution.
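    To illustrate one of the Monte Carlo advances listed above, the parallel tempering idea can be sketched in a few lines: several replicas of a system are sampled at different inverse temperatures β, and neighboring replicas occasionally swap configurations with acceptance probability min(1, exp[(β_i − β_j)(E_i − E_j)]). The double-well potential and all parameter choices below are illustrative choices of ours, not from the report.

```python
import math
import random

def potential(x):
    """Double-well potential V(x) = (x^2 - 1)^2 with minima at x = +/-1."""
    return (x * x - 1.0) ** 2

def parallel_tempering(betas, n_sweeps=20000, swap_every=10, step=0.5, seed=0):
    """Metropolis sampling at several inverse temperatures, with
    replica-exchange (swap) moves between neighboring temperatures."""
    rng = random.Random(seed)
    xs = [1.0 for _ in betas]           # one walker per temperature
    traj = [[] for _ in betas]          # recorded positions per replica
    swaps_tried = swaps_accepted = 0
    for sweep in range(n_sweeps):
        # Ordinary Metropolis move within each replica.
        for i, beta in enumerate(betas):
            x_new = xs[i] + rng.uniform(-step, step)
            d_e = potential(x_new) - potential(xs[i])
            if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
                xs[i] = x_new
            traj[i].append(xs[i])
        # Occasionally attempt to swap configurations of a neighboring pair.
        if sweep % swap_every == 0:
            i = rng.randrange(len(betas) - 1)
            swaps_tried += 1
            delta = (betas[i] - betas[i + 1]) * (
                potential(xs[i]) - potential(xs[i + 1]))
            if delta >= 0 or rng.random() < math.exp(delta):
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
                swaps_accepted += 1
    return traj, swaps_accepted / swaps_tried
```

    The hot replica crosses the barrier between the two wells freely, and the swap moves hand those barrier-crossing configurations down to the cold replica, which is why equilibration is so much faster than for a single low-temperature chain.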

    The rise of fast workstations, cluster computing, and new generations of massively parallel computers completes the picture of the transformation in theory, modeling, and simulation over the last decade and a half. Moreover, these hardware (and basic software) tools are continuing on the Moore's Law exponential trajectory of improvement, doubling the computing power available on a single chip every 18 months. Computational Grids are emerging as the next logical extension of cluster and parallel computing.
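    A quick arithmetic check (ours, not the report's) reconciles the two growth figures quoted in this section: the 1988-to-2001 Gordon Bell jump from 1 Gflop/s to 11 Tflop/s is about four orders of magnitude, while per-chip Moore's-law doubling every 18 months over those 13 years accounts for only a factor of roughly 400; the remainder came from increasingly massive parallelism.

```python
import math

def moores_law_factor(years, doubling_time_years=1.5):
    """Growth factor from doubling every `doubling_time_years` (18 months)."""
    return 2.0 ** (years / doubling_time_years)

# Gordon Bell Prize figures quoted in the report:
gordon_bell_factor = 11e12 / 1e9              # 1 Gflop/s (1988) -> 11 Tflop/s (2001)
chip_factor = moores_law_factor(2001 - 1988)  # per-chip scaling over the same span

print(f"machine speedup: {gordon_bell_factor:.0f}x "
      f"(~{math.log10(gordon_bell_factor):.1f} orders of magnitude)")
print(f"single-chip Moore's-law share: {chip_factor:.0f}x")
```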

    The first and most basic consensus of the workshop is clear: Many opportunities for discovery will be missed if the new tools of theory, modeling, and simulation are not fully exploited to confront the challenges of nanoscience. Moreover, new investments by the DOE and other funding agencies will be required to exploit and develop these tools for effective application in nanoscience.

    This consensus is not merely speculation, but is based on recent experience of the role of theory in the development of nanotechnology. Perhaps no example is more celebrated than the role of calculations based on density functional theory in the development of giant magnetoresistance in magnetic storage systems. The unprecedented speed with which this discovery was exploited in small hard disk drives for computers depended on a detailed picture from theoretical simulations of the electronic structure and electron (spin) transport in these systems. Some details of this remarkable story are given in a sidebar to this report.

    Figure 1. Nanotubes break by first forming a bond-rotation 5-7-7-5 defect. An application of density functional theory and multigrid methods by Buongiorno Nardelli, Yakobson, and Bernholc, Phys. Rev. B and Phys. Rev. Letters (1998).

    C. The Central Challenge

    The discussions and presentations of the workshop identified many specific fundamental challenges for theory, modeling, and simulation in nanoscience. However, a central and basic challenge became clear. Because of the rapid advance of experimental investigations in this area, the need for quantitative understanding of matter at the nanoscale is becoming more urgent, and its absence is increasingly a barrier to progress in the field quite generally.

    The central broad challenge that emerged from discussions at the workshop can be stated simply: Within five to ten years, there must be robust tools for quantitative understanding of structure and dynamics at the nanoscale, without which the scientific community will have missed many scientific opportunities as well as a broad range of nanotechnology applications.

    The workshop audience was reminded in the opening presentation that the electronics industry will not risk deploying billions of devices based on molecular electronics, even when they can be built, unless they are thoroughly understood (Figure 2) and manufacturing processes are made predictable and controllable. The electronics industry must have new simulations and models for nanotechnology that are at least as powerful and predictive as the ones in use today for conventional integrated circuits before it can chance marketing molecular electronics devices for a myriad of applications. It can be argued that biological applications of nanotechnology will require the same level of quantitative understanding before they are widely applied.

    Figure 2. Calculated current-voltage curve for a novel memory-switchable resistor with 5 × 5 junctions. (Stan Williams, Hewlett-Packard)

    The fundamental theory and modeling on which industry will build those tools does not exist. While it is the province of industry to provide its own design tools, it is the role of basic science to provide the fundamental underpinnings on which they are based. Those fundamental tools for quantitative understanding are also necessary to the progress of the science itself.

    D. Key Barriers to Progress in Theory and Modeling in Nanoscience

    Much of the current mode of theoretical study in nanoscience follows the traditional separation of the practice of experiment from the practice of theory and simulation, both separate from the underpinning applied mathematics and computer science. This is not a new observation, nor does it apply only to theory, modeling, and simulation in nanoscience. Nonetheless, it is a particularly problematic issue for this field.

    By its very nature, nanoscience involves multiple length and time scales as well as the combination of types of materials and molecules that have been traditionally studied in separate subdisciplines. For theory, modeling, and simulation, this means that fundamental methods that were developed in separate contexts will have to be combined and new ones invented. This is the key reason why an alliance of investigators in nanoscience with those in applied mathematics and computer science will be necessary to the success of theory, modeling, and simulation in nanoscience. A new investment in theory, modeling, and simulation in nanoscience should facilitate the formation of such alliances and teams of theorists, computational scientists, applied mathematicians, and computer scientists.

    A second impediment to progress in theory, modeling, and simulation in nanoscience arises because theoretical efforts in separate disciplines are converging on this intrinsically multidisciplinary field. The specific barrier is the difficulty, given the present funding mechanisms and policies, of undertaking high-risk but potentially high-payoff research, especially if it involves expensive human or computational resources. At this early stage in the evolution of the field, it is frequently not at all clear what techniques from the disparate subdisciplines of condensed matter physics, surface science, materials science and engineering, theoretical chemistry, chemical engineering, and computational biology will be successful in the new context being created every week by novel experiments. The traditional degree of separation of the practice of experiment from the practice of theory further intensifies the challenge.

    For these reasons, opportunities will be missed if new funding programs in theory, modeling, and simulation in nanoscience do not aggressively encourage highly speculative and risky research. At least one experimentalist at this workshop complained that high-risk and speculative theoretical and computational efforts are too rare in this field, and that sentiment was echoed by a number of theorists.

  • E. Consensus Observations

    A broad consensus emerged at the workshopon several key observations.

    The role of theory, modeling, and simulation in nanoscience is central to the success of the National Nanotechnology Initiative.

    The time is right to increase federal investment in theory, modeling, and simulation in nanoscience to accelerate scientific and technological discovery.

    While there are many successful theoretical and computational efforts yielding new results today in nanoscience, there remain many fundamental intellectual and computational challenges that must be addressed to achieve the full potential of theory, modeling, and simulation.

    New efforts in applied mathematics, particularly in collaboration with theorists in nanoscience, are likely to play a key role in meeting those fundamental challenges as well as in developing computational algorithms that will become mainstays of computational nanoscience in the future.

    F. Opportunity for an Expanded Role for the Department of Energy

    The time is ripe for an initiative in theory and modeling of nanoscale phenomena not only because of the magnitude of the potential payoff, but also because the odds of achieving breakthroughs are rapidly improving.

    Computational simulation is riding a hardware and software wave that has led to revolutionary advances in many fields represented by both continuous and discrete models. The ASCI and SciDAC initiatives of the DOE have fostered capabilities that make simulations with millions of degrees of freedom on thousands of processors for tens of days possible. National security decisions and other matters of policy affecting the environment and federal investment in unique facilities, such as lasers, accelerators, tokamaks, etc., are increasingly being reliably informed by simulation, as are corporate decisions of similar magnitude, such as where to drill for petroleum, what to look for in new pharmaceuticals, and how to design billion-dollar manufacturing lines. Universities have begun training a new generation of computational scientists who understand scientific issues that bridge disciplines from physical principles to computer architecture. Advances in extensibility and portability make major investments in reusable software attractive. Attention to validation and verification has repaired credibility gaps for simulation in many areas.

    Nanotechnologists reaching toward simulation to provide missing understanding will find simulation technology meeting new challenges with substantial power. However, theorists and modelers working on the interface of nanotechnology and simulation technology are required to make the connections.

    The workshop focused considerable attention on the role of applied mathematics in nanoscience. In his presentation, one of the scientists working on molecular dynamics studies of objects and phenomena at the nanoscale expressed the sentiment of the workshop effectively by saying that the role of applied mathematics should be to make tractable the problems that are currently impossible. As mathematics is the language in which models are phrased, developments in applied mathematics are central to the success of an initiative in theory and modeling for nanoscience. A key challenge in nanoscience is the range of length and time scales that need to be bridged. It seems likely that fundamentally new mathematics will be needed to meet this challenge.

    The diverse phenomena within nanoscience will lead to a plethora of models with widely varying characteristics. For this reason, it is difficult to anticipate all the areas of mathematics that are likely to contribute, and serendipity will undoubtedly play a role. As models and their supporting mathematics mature, new algorithms will be needed to allow for efficient utilization and application of the models. These issues and some significant areas of activity are discussed in Section III of this report. But this discussion is not (and cannot be) fully comprehensive due to the breadth of nanoscience and the unknowable paths that modeling will follow.

    The Department of Energy is uniquely situated to build a successful program in theory, modeling, and simulation in nanoscience. Much of the experimental work in nanoscience is already supported by the Department, and new facilities are being built. The Department also has an internationally regarded program in applied mathematics, and much of the foundational work on mathematical modeling and computation has emerged from DOE activities. Finally, the Department has unique resources and experience in high performance computing and algorithms. The conjunction of these areas of expertise makes the Department of Energy a natural home for nanoscience theory and modeling.

    G. Need for Computational Resources and Readiness to Use Them

    The collection of algorithms and computational approaches developed over the last 15 years that form the basis of the revolution in modeling and simulation in areas relevant to nanoscience has made this community intense users of computing at all levels, including the teraflop/s level. This field has produced several Gordon Bell Prize winners for "fastest application code," most recently in the area of magnetic properties of materials. That work, by a team led by Malcolm Stocks at Oak Ridge National Laboratory, is only one example of computing at the terascale in nanoscience.

    The workshop did not focus much discussion specifically on the need for computational resources, since that need is ubiquitous and ongoing. The presentations showed computationally intensive research using the full range of resources, from massively parallel supercomputers to workstation and cluster computing. The sentiment that not enough resources are available to the community, at all levels of computing power, was nearly universally acknowledged. This community is ready for terascale computing, and it needs considerably more resources for workstation and cluster computing.

    A significant amount of discussion in the breakout sessions focused on scalable algorithms, meaning algorithms that scale well with particle number or dimension as well as with increasing size of parallel computing hardware. The simulation and modeling community in nanoscience is one of the most computationally sophisticated in all of the natural sciences. That fact was demonstrated in the talks and breakout sessions of the workshop, and is displayed particularly in the sections of this report devoted to fast algorithms and well-characterized nano building blocks.
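    The "scale well with particle number" point above can be made concrete with the simplest linear-scaling trick in particle simulation, the cell list (a standard technique sketched here with a toy setup of our own, and distinct from the fast multipole method used for long-range forces): for short-ranged interactions, binning particles into cells whose side equals the interaction cutoff reduces the neighbor search from O(N²) to O(N).

```python
import random
from collections import defaultdict

def brute_force_pairs(points, cutoff):
    """O(N^2) count of point pairs closer than `cutoff` (2-D)."""
    n, c2, count = len(points), cutoff * cutoff, 0
    for i in range(n):
        xi, yi = points[i]
        for j in range(i + 1, n):
            dx, dy = xi - points[j][0], yi - points[j][1]
            if dx * dx + dy * dy < c2:
                count += 1
    return count

def cell_list_pairs(points, cutoff):
    """O(N) count for short-ranged interactions: bin points into cells of
    side `cutoff`, then compare only against the 9 neighboring cells."""
    cells = defaultdict(list)
    for p in points:
        cells[(int(p[0] // cutoff), int(p[1] // cutoff))].append(p)
    c2, count = cutoff * cutoff, 0
    for (cx, cy), members in cells.items():
        for ox in (-1, 0, 1):
            for oy in (-1, 0, 1):
                other = cells.get((cx + ox, cy + oy), [])
                for p in members:
                    for q in other:
                        if p is q:
                            continue
                        dx, dy = p[0] - q[0], p[1] - q[1]
                        if dx * dx + dy * dy < c2:
                            count += 1
    return count // 2   # every unordered pair was seen exactly twice
```

    Both routines return the same count, but the brute-force version performs N(N−1)/2 distance checks while the cell-list version examines only the handful of particles in nearby bins, so its cost grows linearly with N at fixed density.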

    H. Summary of Specific Challenges and Opportunities

    The sections that follow identify a large number of challenges in theory, modeling, and simulation in nanoscience together with opportunities for overcoming those challenges. Because an assembly of 50 scientists and applied mathematicians is too small a number to represent all the active areas of nanoscience and related mathematics, the list compiled at the workshop is necessarily incomplete. However, a summary of even the partial catalog accumulated during the workshop and described briefly in the following sections should make a compelling case for investment in theory, modeling, and simulation in nanoscience. The challenges and opportunities include:

To determine the essential science of transport mechanisms, including electron transport (fundamental to the functionality of molecular electronics, nanotubes, and nanowires), spin transport (fundamental to the functionality of spintronics-based devices), and molecule transport (fundamental to the functionality of chemical and biological sensors, molecular separations/membranes, and nanofluidics).

To simulate with reasonable accuracy the optical properties of nanoscale structures and to model nanoscale optoelectronic devices, recognizing that in confined dimensions, optical properties of matter are often dramatically altered from properties in bulk.

To devise theoretical and simulation approaches to study nano-interfaces, which are necessarily highly complex and heterogeneous in shape and substance, and are often composed of dissimilar classes of materials.

To simulate complex nanostructures involving many molecular and atomic species as well as the combination of soft biological and/or organic structures and hard inorganic ones.

To devise theoretical and simulation approaches to nano-interfaces between hard and soft matter that will play central roles in biomolecular materials and their applications.

To address the central challenge of bridging a wide range of length and time scales so that phenomena captured in atomistic simulations can be modeled at the nanoscale and beyond.

To simulate self-assembly, the key to large-scale production of novel structures, which typically involves many temporal and spatial scales and many more species than the final product.

To devise theoretical and simulation approaches to quantum coherence and decoherence, including tunneling phenomena, all of which are central issues for using nanotechnology to implement quantum computing.

To devise theoretical and simulation approaches to spintronics, capable of accurately describing the key phenomena in semiconductor-based spin valves and spin qubits.

To develop self-validating and benchmarking methodologies for modeling and simulation, in which a coarser-grained description (whether it is atomistic molecular dynamics or mesoscale modeling) is always validated against more detailed calculations, since appropriate validating experiments will often be difficult to perform.

For each entry in this list, and the many that can be added to it, there is an established state of the art together with an array of specific technical issues that must be faced by researchers. For all of the challenges listed, there is also an array of fundamental questions to be addressed by researchers in applied mathematics.

This report is a brief synthesis of the effort of approximately 50 experts in the nanosciences, mathematics, and computer science, drawn from universities, industry, and the national laboratories. Following the scientific presentations and a panel discussion on the roles of applied mathematics and computer science, the principal recommendations of the workshop were developed in breakout sessions. Three of them focused on nanoscience directly:

Well Characterized Nano Building Blocks

    Complex Nanostructures and Interfaces

Dynamics, Assembly, and Growth of Nanostructures

Three others focused on the role of applied mathematics and computer science:

    Crossing Time and Length Scales

    Fast Algorithms

Optimization and Predictability

There were, of course, other possible ways to organize the discussions, but the outcome would likely have been the same no matter what the organization of the topics. Nevertheless, the structure of this report reflects this particular selection of categories for the breakout sessions.


Giant Magnetoresistance in Magnetic Storage

The giant magnetoresistance (GMR) effect was discovered in 1988 and within a decade was in wide commercial use in computer hard disks (Figure 3) and magnetic sensors. The technology significantly boosted the amount of information that could be recorded on a magnetic surface (Figure 4). The unprecedented speed of application (less than 10 years from discovery to deployment) resulted largely from advances in theory and modeling that explained the microscopic quantum-mechanical processes responsible for the GMR effect.

    Figure 3. GMR and MR head structures. (IBM)

    Figure 4. Magnetic head evolution. (IBM)

The effect is observed in a pair of ferromagnetic layers separated by a (generally) nonmagnetic spacer. It occurs when the magnetic moment in one ferromagnetic layer is switched from being parallel to the magnetic moment of the other layer to being antiparallel (Figure 5). When a read head senses a magnetic bit on the storage medium, it switches the magnetic moment on one layer and measures the resistance, thus sensing the information on the disk.

Soon after the discovery of GMR, the phenomenon was seen to be related to the different resistances of electrons having different spins


Figure 5. Schematic of GMR indicating change in resistance accompanying magnetization reversal upon sensing an opposing bit. (IBM)

(up or down). In fact, the magnetic moment itself results from an imbalance in the total number of electrons of each spin type. Modern quantum-mechanical computational methods based on density functional theory (for which Walter Kohn was awarded the 1998 Nobel Prize) then provided a detailed picture of the electronic structure and electron (spin) transport in these systems. Indeed, some unexpected effects, such as spin-dependent channeling, were first predicted on the basis of first-principles calculations and were only later observed experimentally.
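The spin-dependent two-channel picture described here can be made concrete with the standard two-current resistor model of GMR, a textbook simplification added for illustration (not a calculation from the workshop): each magnetic layer presents a resistance r_maj to majority-spin electrons and r_min to minority-spin electrons, and the two spin channels conduct in parallel.

```python
def series(r1, r2):
    return r1 + r2

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

def gmr_ratio(r_maj, r_min):
    """Two-current model: each spin channel crosses two magnetic layers.

    r_maj, r_min: per-layer channel resistances (arbitrary units) for
    majority- and minority-spin electrons, respectively.
    """
    # Parallel magnetizations: one channel is majority in both layers,
    # the other is minority in both.
    R_P = parallel(series(r_maj, r_maj), series(r_min, r_min))
    # Antiparallel: each channel is majority in one layer, minority in
    # the other, so both channels see the same resistance.
    R_AP = parallel(series(r_maj, r_min), series(r_min, r_maj))
    return (R_AP - R_P) / R_P

# Hypothetical channel resistances chosen only for illustration.
print(gmr_ratio(1.0, 5.0))  # ~0.8, i.e., an 80% resistance change
```

Eliminating the intermediate resistances gives the well-known closed form (r_min - r_maj)^2 / (4 r_maj r_min): the effect vanishes when the two spin channels are equally resistive and grows with their asymmetry.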

In fact, GMR is only one aspect of the rich physics associated with the magnetic multilayers now used in read heads. Equally important are oscillatory exchange coupling (which is used to engineer the size of the magnetic field required to switch the device) and exchange bias (which is used to offset the zero of the switching field in order to reduce noise). For exchange coupling, a detailed understanding of the mechanisms responsible has been obtained on the basis of first-principles theory. Less is known about exchange bias, but significant progress has been made on the basis of simplified models and with first-principles calculations of the magnetic structure at the interface between a ferromagnet and an antiferromagnet.

Impressive as these advances in theory and modeling have been, their application to nanomagnetism has only just begun. New synthesis techniques have been discovered for magnetic nanowires, nanoparticles, and molecular magnets; magnetic semiconductors with high Curie temperatures have been fabricated; and spin-polarized currents have been found to drive magnetic-domain walls. When understood through theory and modeling, these findings are also likely to lead to technological advances and commercial applications.


II. Theory, Modeling, and Simulation in Nanoscience

The challenges presented by nanoscience and nanotechnology are not simply restricted to the description of nanoscale systems and objects themselves, but extend to their design, synthesis, interaction with the macroscopic world, and ultimately large-scale production. Production is particularly important if the technology is to become useful in society. Furthermore, the experience of the workshop participants from the engineering community strongly suggests that a commercial enterprise will not commit to large-scale manufacture of a product unless it can understand the material to be manufactured and can control the process to make products within well-defined tolerance limits.

For macroscopic systems, such as the products made daily by the chemical, materials, and pharmaceutical industries, that knowledge is often largely or exclusively empirical, founded on an experimental characterization over the ranges of state conditions encountered in a manufacturing process, since macroscopic systems are intrinsically reproducible. Increasingly, however, this characterization is based on molecular modeling, including all of the tools relevant to modeling nanoscale systems, such as electronic structure, molecular simulation, and mesoscale modeling methods. Indeed, the report Technology Vision 2020: The U.S. Chemical Industry¹ has identified molecular modeling as one of the key technologies that the chemical industry needs to revolutionize its ability to design and optimize chemicals and the processes to manufacture them.

Molecular modeling already plays a major role in the chemical, materials, and

¹ American Chemical Society et al., 1996, http://membership.acs.org/i/iec/docs/chemvision2020.pdf.

pharmaceutical industries in the design of new products, the design and optimization of manufacturing processes, and the troubleshooting of existing processes. However, the exquisite dependence on details of molecular composition and structures at the nanoscale means that attempting to understand nanoscale systems and control the processes to produce them based solely on experimental characterization is out of the question.

The field of nanoscience is quite broad and encompasses a wide range of yet-to-be-understood phenomena and structures. It is difficult to define nanoscience precisely, but a definition consistent with the National Nanotechnology Initiative is:

The study of structures, dynamics, and properties of systems in which one or more of the spatial dimensions is nanoscopic (1–100 nm), thus resulting in dynamics and properties that are distinctly different (often in extraordinary and unexpected ways that can be favorably exploited) from both small-molecule systems and systems macroscopic in all dimensions.

Rational fabrication and integration of nanoscale materials and devices offers the promise of revolutionizing science and technology, provided that principles underlying their unique dynamics and properties can be discovered, understood, and fully exploited. However, functional nanoscale structures often involve quite dissimilar materials (for example, organic or biological in contact with inorganic), are frequently difficult to characterize experimentally, and must ultimately be assembled, controlled, and utilized by manipulating quantities (e.g., temperature, pressure, stress) at the macroscale.


This combination of features puts unprecedented demands on theory, modeling, and simulation: for example, due to nanoscale dimensionality, quantum effects are often important, and the usual theories valid either for bulk systems or for small molecules break down. Qualitatively new theories are needed for this state of matter intermediate between the atomic scale and a scale large enough for collective behavior to take over, as well as methods for connecting the nanoscale with finer and coarser scales.

Within the context of this definition of nanoscience and the need for significant advances in our understanding of the nanoscale, three breakout sessions on theory, modeling, and simulation in nanoscale science considered three broad classes of nanosystems:

The first class consisted of nano building blocks (such as nanotubes, quantum dots, clusters, and nanoparticles), some of which can be synthesized quite reproducibly and are well characterized experimentally.

The second class consisted of complex nanostructures and nano-interfaces,² reflecting the importance of nano-interfaces in a wide variety of nanoscale systems. This session was specifically asked to consider steady-state properties of complex nanostructures and nano-interfaces.

² We can distinguish between interfaces in general and nano-interfaces as follows: Typically, an interface separates two bulk phases; we define, consistent with the above definition, a nano-interface as one in which the extent of one or more of the phases being separated by the interface is nanoscopic. In nano-interfaces we include nano-interconnects, which join two or more structures at the nanoscale. For nano-interfaces, traditional surface science is generally not applicable.

The third session focused on dynamics, assembly, and growth of nanostructures, and so was concerned with the dynamical aspects of complex nanostructures. Hence, transport properties (such as electron and spin transport and molecular diffusion) of complex nanostructures, as well as the dynamic processes leading to their creation, particularly self-assembly, were central to this session.

The reports presented below are summaries drafted by the session chairs. Given the ostensibly different charges presented to each group, there is remarkable unanimity in their conclusions with respect to the outstanding theory, modeling, and simulation needs and opportunities. Among the common themes are:

the need for fundamental theory of dynamical electronic structure and transport

algorithmic advances and conceptual understanding leading to efficient electronic structure calculations for large numbers of atoms

new theoretical treatments to describe nano-interfaces and assembly of nanostructures

fundamentally sound methods for bridging length and time scales and their integration into seamless modeling environments

commitment to an infrastructure for community-based open-source codes.

These common themes have been incorporated into the recommendations and challenges outlined in Section I of this report.


    A. Nano Building Blocks

Well characterized building blocks for nanoscience need to be created and quantitatively understood for a number of crucial reasons. A key aspect of construction, whether at the macroscopic or microscopic scale, is the notion of a building block. Office buildings are often made from bricks and girders, molecular components from atoms. Just as knowledge of the atom allows us to make and manipulate molecular species, knowledge of bricks and mortar allows us to construct high-rise buildings.

Well-characterized nano building blocks will be the centerpiece of new functional nanomechanical, nanoelectronic, and nanomagnetic devices. A quantitative understanding of their transport, dynamical, electronic, magnetic, thermodynamic, and mechanical properties is crucial to building at the nanoscale. As an example, current theoretical approaches to nanoelectronics often cannot accurately predict I-V curves for even the simplest molecular systems. Without a coherent and accurate theoretical description of this most fundamental aspect of devices, progress in this area will necessarily be limited.

Theoretical studies of these systems will necessarily have to be based on approximate methods; the systems are simply too large and too complex to be handled purely by ab initio methods. Even where the problems are tractable, we will make much greater progress with well-validated approximate models. Thus a variety of benchmarking methodologies and validation tests are needed to establish the accuracy and effectiveness of new models and approximate theories.

At the nanoscale, what are the nano building blocks that would play a role analogous to macro building blocks? Several units come to mind, ranging from clusters less than 1 nm in size to nanoparticles, such as aerosols and aerogels, much larger than 100 nm. However, we believe the best-characterized and most appropriate building blocks are:

    clusters and molecular nanostructures

    nanotubes and related systems

quantum wells, wires, films, and dots.

These blocks are well defined by experiment and frequently tractable using contemporary theory. Moreover, they have been demonstrated to hold promise in existing technology. We believe that these building blocks would have an immediate impact in the five areas described below.

Transport in Nanostructures: Electronic Devices

Any electronic device needs to control and manipulate current. At the nanoscale, new phenomena come into play. Classical descriptions break down in the face of quantum phenomena such as tunneling, fluctuations, confined dimensionality, and the discreteness of the electric charge. At these dimensions, the system will be dominated entirely by surfaces. This presents further complications for electronic materials, which often undergo strong and complex reconstructions. Defining an accurate structure at the interface will be difficult. However, without an accurate surface structure, it will be even more difficult to compute any transport properties. Moreover, current-induced changes within the structure may occur. This situation will require the complex, but not intractable, approach of a careful self-consistent solution in a nonequilibrium environment.


Optical Properties on the Nanoscale: Optoelectronic Devices

At confined dimensions, optical properties of matter are often dramatically altered. For example, silicon is the dominant electronic material, but it has poor optical properties for optoelectronic devices such as solar cells or lasers. However, at small dimensions the properties of silicon can be dramatically altered; the band gap of silicon can be blue-shifted from the infrared to the optical region. One of the first manifestations of this effect occurs in porous silicon, which exhibits remarkable room-temperature luminescence. To properly capitalize on such phenomena, a deeper quantitative understanding of the optical properties of matter will be required.
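The blue shift mentioned here can be estimated with the simplest possible model: an infinite-well, effective-mass "particle in a box," in which the confinement energy grows as 1/L². This back-of-envelope sketch is illustrative only; the effective mass of 0.2 electron masses is an assumed, order-of-magnitude value, and quantitative predictions for real quantum dots require the many-body methods discussed below.

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # electron volt, J

def confinement_shift_eV(L_nm, m_eff=0.2, dims=3):
    """Ground-state kinetic-energy shift of a carrier confined in a cubic
    box of side L_nm (infinite-well, effective-mass approximation).

    m_eff is the carrier effective mass in units of the free-electron
    mass (0.2 is an assumed, illustrative value, not a fitted one).
    """
    L = L_nm * 1e-9
    return dims * (HBAR * math.pi) ** 2 / (2 * m_eff * M_E * L ** 2) / EV

# The shift grows as 1/L^2: halving the box size quadruples the blue shift.
for L in (10.0, 5.0, 2.0):
    print(f"L = {L:4.1f} nm -> shift ~ {confinement_shift_eV(L):.3f} eV")
```

For a dot of roughly 2 nm this gives a shift on the order of 1 eV, the right magnitude for pushing silicon's roughly 1.1 eV infrared gap toward the visible, consistent with the porous-silicon observations noted above.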

Optical excitations are especially challenging because most commonly used methods for structural energies, such as density functional theory, are not well suited for excited-state properties. The problem is exacerbated for nanoscale systems, where many-body effects are enhanced by the physical confinement of the excitation.

Coherence/Decoherence Tunneling: Quantum Computing

While silicon dominates electronic devices, there is a fundamental limit to this technology. As the size of a transistor is reduced each year, we will eventually approach new size regimes where traditional electronics must be viewed in different terms. The approach to this regime will happen in the not-too-distant future. Current lithographic techniques can create semiconductor chips with characteristic lengths of only a few hundred nanometers; quantum effects are likely to dominate at tens of nanometers. At this latter length scale, transistor technology will be solidly in the regime of quantum phenomena.

Components will eventually function solely through quantum effects, which we will need the capability to accurately simulate or predict in order to control. It has been suggested that new computers can be devised at the nanoscale that will function by using the creation of coherent quantum states. By creating and maintaining such coherent states, it should be possible to store enormous amounts of information with unprecedented security. In order to construct such new quantum computers, new fabrication methods will need to be found and guided by theory. Current technology is far removed from the routine generation of quantum logic gates and quantum networks. Without a better quantitative description of the science at the nanoscale, the emergence of quantum computing will not happen.

Soft/Hard Matter Interfaces: Biosensors

Nanotechnology has the potential to enable great advances in medical technology and biosecurity. The technologies for controlling and processing materials for semiconductor devices can be used to make new types of nano-interfaces. Nanostructure synthesis and the joining of biological or soft matter with traditional semiconductors like silicon would offer new pathways to the construction of novel devices. For example, one could gain the ability to develop new types of sensors directly integrated into current device technology. A number of clear applications, such as biosensors for national security, drug delivery and monitoring, and disease detection, are plausible.

However, numerous complex and difficult issues must be addressed. These include quantitative descriptions of nanofluids, reaction kinetics, and selectivity for particular biological agents. Perhaps no systems are as complex at the nanoscale as these, because they require understanding the details of interactions between quite diverse systems, i.e., soft, biological materials interfaced with semiconductors or other inorganic solids.

Spintronics: Information Technology

Spintronics centers on controlling the spin of electrons in addition to controlling other transport properties. For example, dilute magnetic semiconductors have attracted considerable attention because they hold the promise of using electron spin, in addition to charge, to create a new class of spintronic semiconductor devices with unprecedented functionality. Suggested applications include spin field-effect transistors, which could allow for software reprogramming of microprocessor hardware at run time; semiconductor-based spin valves, which would result in high-density, nonvolatile semiconductor memory chips; and even spin qubits, to be used as the basic building block for quantum computing.

At the nanoscale, spintronic devices offer special challenges owing to the enhanced exchange terms arising from quantum confinement. Few quantitative studies exist on the magnetic properties of quantum dots and the evolution of ferromagnetic properties with size.

Implications for Theory and Modeling

These five areas of research offer great promise but will require new theories (approximate models) and computationally intensive studies. Current algorithms must be made more efficient, and sometimes more accurate, or new algorithms must emerge in order to address the challenges highlighted above. For example, ab initio methods can address systems with hundreds of atoms, and empirical methods can handle tens of thousands of atoms. To fully exploit these approaches, algorithms need to be made scalable and to take full advantage of current computer technology, i.e., highly parallel environments. Fully ab initio methods are especially useful in providing benchmarks for these building blocks where experimental data is lacking or unreliable owing to unknown variability or uncontrolled critical function characteristics. Embedding methods, whereby one can combine the best features of different approaches, will be crucial.

Algorithms for particular aspects of nanoscale systems need to be developed. For example, ab initio methods for optical properties, such as the GW and Bethe-Salpeter methods, are known to be accurate and robust, yet these methods are limited in their applicability to rather small systems, e.g., systems of fewer than 50 atoms. The essential physical features of these methods need to be adapted to address much larger systems.

Nonequilibrium and quantum dynamics present special problems. These systems often lack clear prescriptions for obtaining reliable results. Some recent attempts to match experiments on electron transport through molecular systems often capture the correct transport trends, but not quantitative numbers. A number of entanglements exist: Is the experimental data unreliable owing to problems with contacts? Or is some fundamental component missing from our understanding? New experiments will need to be designed to ensure reproducibility and the validity of the measurements. New theories will need to be constructed and cross-checked by resorting to the fundamental building blocks.

A chronic and continuing challenge arising in most nanosystems concerns a lack of understanding of structural relationships. Nanosystems often have numerous degrees of freedom that are not conducive to simple structural minimizations. Simulated annealing, genetic algorithms, and related techniques can be used for some systems, but in general new optimization techniques will be required in order to obtain a quantitative description of the structure of nanoscale systems.

While great strides have been made in simulation methods, a number of fundamental issues remain. The diversity of time and length scales remains a great challenge at the nanoscale. Rare-event simulation methods will be essential in describing a number of phenomena such as crystal growth, surface morphologies, and diffusion. Of course, the transcription of quantum force fields to classical interatomic potentials is a difficult task. Intrinsic quantum attributes like hybridization and charge transfer remain a challenge to incorporate into classical descriptions. At best, current classical-like descriptions can be used for targeted applications.

In order to facilitate dissemination of new algorithms and numerical methods, open-source software should be encouraged and made widely available. Toolkits and efficient codes should be offered without restrictions on distribution to as wide an audience as possible.

    B. Complex Nanostructures and Interfaces

Central to nanoscience is the assembly and manipulation of fundamental building blocks on the nanoscale to create functional structures, materials, and devices (Figure 6). Unlike traditional materials, the nanostructured materials that will enable the nanotechnology revolution will be highly complex and heterogeneous in shape and substance, composed of dissimilar materials, and dominated by nano-interfaces (Figure 7). These characteristics require that new theoretical approaches and computational tools be developed to provide a fundamental understanding of complexity at the nanoscale, the physics and chemistry of nano-interfaces, and how interfaces and complexity at the nanoscale control properties and behavior at larger scales.

Nano-interfaces and complexity at the nanoscale play a central role in nearly all aspects of nanoscience and nanotechnology. For example, in molecular electronic devices, DNA and other biomolecules such as proteins are being explored both as assemblers of molecular components like carbon nanotubes and semiconductor quantum dots (Figure 8), and as active components themselves (Figures 9 and 10). In such devices, soft biomolecular matter and hard matter will come together at complex nano-interfaces across which transport of charge may occur, or which may provide structural stability. In new spintronic devices, the nano-interfaces between ferromagnetic and antiferromagnetic domains may control the speed of switching needed for faster computers. In polymer nanocomposites, where nanoscopic particles are added to a polymer, interactions at the polymer/particle nano-interface can have profound effects on bulk properties, leading to stronger yet lighter-weight structural and other materials. The manipulation of interfaces in soft-matter systems at the nanoscale provides a wealth of opportunities to design functional, nanostructured materials for templating, scaffolds for catalysis or tissue growth, photonic devices, and structural materials (Figure 11).

The high surface-to-volume ratio resulting from the prevalence of nano-interfaces, combined with the complexity of these nano-interfaces and of nanostructures, provides much of the challenge in developing predictive theories in nanoscience. In this respect, materials designed at the nanoscale are distinctly different from traditional bulk materials. One of the largest driving forces behind the need for new theory and simulation in nanoscience is the fact that there are few experimental techniques capable of directly imaging or probing nano-interfaces. Typically, those techniques having atomic-scale resolution require extensive theory to produce more than qualitative information. Indeed, it is likely that in many cases the fundamental interfacial and spatial information required to predict the behavior and performance of nanostructures will come from simulation and theory.

Figure 6. Three-dimensional view of Charles Lieber's concept for a suspended crossbar array shows four nanotube junctions (device elements), with two of them in the on (contacting) state and two in the off (separated) state. The bottom nanotubes lie on a thin dielectric layer (for example, SiO2) on top of a conducting layer (for example, highly doped silicon). The top nanotubes are suspended on inorganic or organic supports (gray blocks). Each nanotube is contacted by a metal electrode (yellow blocks). (IBM)

Figure 7. Schematic of an assembled structure of nanoscopic building blocks tethered by organic linkers. The structure is dominated by interfacial connections. (S. C. Glotzer, 2002)

Figure 8. Top: Schematic of nanoscale building blocks tethered by DNA. The soft/hard interface in this complex nanostructure controls the transport of information through the structure. Bottom: Schematic of nanoscopic gold nanoparticles assembled onto a surface with DNA. Such guided assembly can be used, e.g., to position the nanoparticles for fabrication into a nanowire.

Figure 9. Atomistic simulation of a nanostructure composed of polyhedral oligomeric silsesquioxane cages. (J. Kieffer, 2002)

Figure 10. Simulation of organic oligomers attracted to the surface of a nanoscopic quantum dot. (F. W. Starr and S. C. Glotzer, 2001)

Figure 11. Simulation of nanostructured domains in a polymer blend where the interfaces are controlled at the nanoscale. (S. C. Glotzer, 1995)

Focused theory and simulation at a wide range of scales is needed to develop a quantitative understanding of nano-interfaces and their consequent influence on the properties and emergent behavior of nanoscale materials and devices. Breakthroughs in theory and simulation will also be necessary in order to describe and predict the complexity of nanostructures and the relationship between structural complexity and emergent properties and behavior.

A major challenge for nanoscience in which theory, modeling, and simulation will play a pivotal role lies in the definition of the nano-interface, which is nontrivial because the definition may depend upon the property or behavior of interest. For example, the relevant nano-interface leading to a particular mechanical behavior may be different from that for transport properties, such as electrical transport. Strategies for addressing nano-interfaces will also be needed. With an appropriate definition of nano-interfaces, new approaches can be developed that characterize relevant nano-interfaces in order to construct quantitatively predictive theories that relate the nano-interfaces to interesting nanoscale properties and behavior.

To develop theories and quantitative understanding of nano-interfaces, it will be necessary to obtain information on the organization of matter at nano-interfaces, that is, the arrangement, composition, conformation, and orientation of matter at hard-hard, hard-soft, and soft-soft nano-interfaces. Much of this information may only be accessible by simulation, at least for the foreseeable future. The geometry and topology of nano-interfaces, bonding at nano-interfaces, structure and dynamics at or near a nano-interface, transport across nano-interfaces, confinement at nano-interfaces, deformation of nano-interfaces, activity and reactivity at nano-interfaces, etc. must all be investigated by simulation. Such information is needed before very accurate theories relating all of these to material and device operation and behavior can be fully formulated and validated.

For example, to properly design, optimize, and fabricate molecular electronic devices and components, we must quantitatively understand transport across soft/hard contacts, which in turn will likely depend on the arrangement of components on the surface and the complexity of structures that can be assembled in one, two, and three dimensions. Comprehending transport will likely require a quantitative understanding of the nature of nano-interfaces between biological matter and hard matter, between synthetic organic matter and hard matter, and between biological matter and synthetic organic matter. It will also require optimization of structures and tailoring of nano-interfaces, which in turn will require a quantitative understanding of the stability of nano-interfaces.

    As we move toward the goal of first-principles-based, fully theoretical prediction of the properties of nanoscale systems, there is an urgent need to critically examine the application of bulk thermodynamic and statistical mechanical theories to nanostructures and nano-interfaces, where violations of the second law of thermodynamics have been predicted and observed, and where mechanical behavior is not described by continuum theories.


    Theories are required that abstract from behavior at a specific nano-interface to predict the behavior of extended structures or material composed primarily of nano-interfaces. Additionally, when describing nanostructures that are collections of a countable number of fundamental building blocks, one must question whether the concept of temperature has the same meaning as in a bulk system. An important task for theoreticians and simulationists will be to identify what can be brought to nanoscience from traditional methods of investigation and description, and then to determine what new theories or approaches must be invoked at the nanoscale.

    When materials or devices are dominated by nano-interfaces, new physics is likely to result. Examples of nano-interface-dominated physics include the electric field at surfaces, luminescence of rare-earth orthosilicates, surface-enhanced Raman scattering, and deformation of nanocrystalline materials. Nanocomposites and nanoassemblies are just two examples of nanostructures where nano-interfaces are prevalent and may dominate behavior at larger scales. The high surface-to-volume ratio of complex nanostructures may also lead to novel collective or emergent phenomena, which would require new theory and simulation efforts to address and understand.

    Because of the present lack of theories to fully describe nano-interfaces at the nanoscale and the ramifications of nano-interfaces on larger scales, and the lack of experimental data on nano-interfaces, simulations will play a pivotal role in advancing knowledge in this area. Ab initio calculations in particular will be required, but the present state of the art limits the number of atoms to numbers too small to obtain meaningful data in many instances. Consequently, breakthroughs in methodologies will be required to push ab initio simulations to larger scales. Ab initio simulations containing on the order of 10,000 atoms would enable, for example, the study of spin structure at antiferromagnetic-ferromagnetic nano-interfaces.

    New or improved computational methods are also needed to simulate the dynamics of nano-interfaces from first principles. Such simulations will be needed to predict, for example, electronic transport across nano-interfaces, as well as to provide validation or benchmarks for coarser-grained theoretical descriptions of nano-interfaces in which electronic structure is not explicitly considered. For molecular dynamics methods and other classical, particle-based methods, new (for many purposes, reactive) force fields capable of describing nano-interfaces between dissimilar materials are urgently needed. For some problems, further coarse-graining will be required to capture complexity on all relevant length and time scales, and thus new mesoscale theories and computational methods will be needed. To connect the properties and behavior of complex nanostructures and/or nano-interfaces to properties and behavior at the macroscale, it may be necessary to develop new constitutive laws (and to ask whether it is even appropriate to talk about constitutive laws in systems where properties and behavior are dominated by nano-interfaces) and then to develop and integrate theories at different levels.


    C. Dynamics, Assembly, and Growth of Nanostructures

    The focus of this section is the time-dependent properties of nanostructures (such as transport properties) and the time-dependent processes used to produce nanostructures and nanostructured materials (such as directed self-assembly, nucleation and growth, and vapor deposition methods).

    The primary route to manufacturable functional nanoscale systems is universally conceded to be self-assembly, which may be directed or not. Self-assembly of nanostructured materials is scalable to industrial levels of production because it does not require control at the nanoscale (such as, for example, manipulation via an AFM tip). There is a wide variety of transport mechanisms relevant to nanoscience, including electron transport (relevant to molecular electronics, nanotubes, and nanowires), spin transport (relevant to spintronics-based devices; see Figure 12), and molecule transport (relevant to chemical and biological sensors, molecular separations/membranes, and nanofluidics).

    Examples of nanostructures and nanostructured systems formed by (or capable of formation by) self-assembly include quantum dot arrays, nano- and microelectromechanical systems (NEMS and MEMS), nanoporous adsorbents and catalytic materials (produced by templated self-assembly; see Figure 13), nanocrystals, and biomimetic materials. This list is by no means exhaustive.

    The role that mathematics and computer science researchers can play is to apply mathematical and computational tools developed in other contexts to the understanding, control, design, and optimization of nanostructured materials and functional systems based on nanoscale structures. Each of these elements (understanding, control, design, and optimization) is essential to the manufacturability of systems composed of nanoscale objects, as noted by Stan Williams of Hewlett-Packard in the opening talk of the workshop.

    Figure 12. Spintronics devices utilize an electron's spin rather than its charge. Shown here is an artist's depiction of a proposal by Bruce Kane of the University of Maryland for a quantum computer based on the nuclear spins of phosphorus atoms (green), which interact by means of spin-polarized electrons (red). The quantum properties of superposition and entanglement may someday permit quantum computers to perform certain types of computations much more quickly using less power than is possible with conventional charge-based devices. (From S. Das Sarma, Spintronics, American Scientist 89, 516 (2001))

    Figure 13. Templated self-assembly of nanoporous material, in this case MCM-41. (From X. S. Zhao, Synthesis, modification, characterization and application of MCM-41 for VOC Control, Ph.D. thesis, University of Queensland, 1998)

    Among the major challenges facing theory, modeling, and simulation (TMS) in the dynamics, assembly, and growth of nanostructures are the simulation of self-assembly (which typically involves many temporal and spatial scales, and many more species than the final product); methods for calculating electron and spin transport properties in nanoscale systems; and the processing and control (particularly to achieve more uniform size distributions) of self-assembly of heterostructures composed of hard materials (such as group IV semiconductors, obtained by combined ion and molecular beam epitaxial growth).

    The TMS methods needed to faithfully model the dynamics, assembly, and growth of nanostructures consist of the usual methods we associate with TMS (electronic structure methods, atomistic simulation, and mesoscale and macroscale methods) but with additional variations specific to these problems. These variations include:

    • Hybrid electronic structure/molecular dynamics methods scalable to very large systems are needed to model combined reaction and structure evolution.

    • Coarse-graining/time-spanning methods are crucial, owing to the large times associated with self-assembly and the large microscopic and mesoscopic domains that exist in many systems.

    • Since many self-assembly processes take place in solution, electronic structure methods incorporating solvation are necessary.

    • Molecular and nanoscale methods that incorporate phase equilibria and nonequilibrium multiphase systems (including interfaces) are required.

    Participants also saw the necessity of traditional finite element elastic/plastic codes for stress/strain description at the macroscale, since ultimately nanostructured materials must connect with the macroscopic world. It was noted that upscaling (going to coarser length and time scales) typically results in the introduction of stochastic terms to reflect high-frequency motion at smaller scales; thus, more rigorous methods for including stochasticity, and more effective methods for characterizing and solving stochastic differential equations, are required.

    New developments in nonequilibrium statistical mechanics, of both classical and quantum systems, are needed. For example, the recent discovery by nonequilibrium molecular dynamics, confirmed by nonlinear response theory, of violations of the second law of thermodynamics for nanoscale systems indicates the importance of new theoretical developments focused on nanoscale systems. Quantifying errors in calculated properties was identified as a significant problem that consists of two parts: characterization of uncertainty due to inaccuracies inherent in the calculation (such as limitations of a force field in molecular dynamics or of a basis set in electronic structure calculations), and performing useful sensitivity analyses.
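    The second-law "violations" cited here are quantified by the fluctuation theorem of Evans and Searles. The statement below is our gloss for context, not an equation taken from the report:

```latex
% Evans-Searles fluctuation theorem (our summary, for context):
% probability of observing time-averaged entropy production
% \bar{\Sigma}_t = A over an interval t, relative to observing -A:
\frac{P(\bar{\Sigma}_t = A)}{P(\bar{\Sigma}_t = -A)} = e^{A t}
```

    For macroscopic systems this ratio is astronomically large, but for nanoscale systems observed over short times, trajectories with negative entropy production occur with small yet measurable probability.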

    The important aspect of nanoscale systems that makes them such a TMS challenge is that the characterization of uncertainty will often have to be done in the absence of experimental data, since many of the property measurement experiments one would like to perform on nanoscale systems are impossible, or are unlikely to be done even where they are possible. Hence, the concept of self-validating TMS methods arises as a significant challenge in nanoscience. By self-validating TMS methods we mean that a coarser-grained description (whether it be atomistic molecular dynamics or mesoscale modeling, for example) is always validated against more detailed calculations (e.g., electronic structure calculations in the case of atomistic molecular dynamics, and atomistic molecular dynamics in the case of mesoscale modeling).
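    At its core, such self-validation reduces to a cross-scale consistency check. The sketch below is purely illustrative; the function names and the choice of a scalar observable are our assumptions, not the report's:

```python
import math

def validation_error(coarse_model, fine_model, configs):
    """Root-mean-square discrepancy between a coarse-grained model and a
    finer-grained reference over a set of configurations. Both arguments
    are callables returning a scalar property (e.g., an energy) for a
    configuration; this interface is a hypothetical example."""
    sq_errs = [(coarse_model(x) - fine_model(x)) ** 2 for x in configs]
    return math.sqrt(sum(sq_errs) / len(sq_errs))

# Toy check: a "coarse" model systematically off by 1 from the "fine"
# reference has an RMS validation error of exactly 1.
fine = lambda x: 2.0 * x
coarse = lambda x: 2.0 * x + 1.0
err = validation_error(coarse, fine, [0.0, 1.0, 2.0])
```

    In practice the "fine" callable would be the more expensive calculation (e.g., electronic structure), invoked only on a sampled subset of configurations.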

    Various computational issues were identified as important for computational nanoscience. In particular, an open source software development framework for multiscale modeling was identified as a real need. The OCTA project in Japan (http://octa.jp) may point the way toward the development of such a framework, or may indeed be the appropriate framework on which to expand simulation capabilities. Visualization of nanoscale systems is challenging, since the spatiotemporal scales cover so many orders of magnitude. Truly effective parallelization of Monte Carlo was identified as an important issue; specifically, efficient domain-decomposition parallelization of Monte Carlo on very large systems with long-range forces was seen as an unsolved problem with ramifications for computational nanoscience. Finally, parallel codes for any TMS method that scale efficiently to tens of thousands of processors were targeted, because the highest-performance computers are headed toward this number of processors, yet it is not clear that any scalable codes exist at this point for such large machines.
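    To see why domain decomposition of Monte Carlo is awkward, consider a minimal serial Metropolis sweep for a 1-D Ising chain (a standard textbook algorithm, shown here as a hedged sketch, not code from the workshop): each accept/reject decision reads the current state of neighboring sites, so concurrent updates of interacting regions can break detailed balance, and long-range forces make every pair of subdomains interact.

```python
import math
import random

def metropolis_sweep(spins, beta, J=1.0, rng=random):
    """One Metropolis sweep over a 1-D Ising chain (+1/-1 spins) with
    periodic boundaries. Each flip decision depends on the *current*
    neighbor states, which is exactly what naive parallelization over
    subdomains would violate."""
    n = len(spins)
    for i in range(n):
        left, right = spins[(i - 1) % n], spins[(i + 1) % n]
        dE = 2.0 * J * spins[i] * (left + right)  # energy cost of flipping spin i
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            spins[i] = -spins[i]
    return spins
```

    At infinite temperature (beta = 0) every proposed flip is accepted; at low temperature an aligned chain is essentially frozen, since the flip probability for an aligned spin is exp(-4*J*beta).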

    We identified three grand-challenge-scale problems within self-assembly that would require theoretical, algorithmic, and computational advances, and which might be specifically targeted in any future program in computational nanoscience. These are the simulation of self-assembly of templated nanoporous materials; self-assembly of nanostructures directed by DNA and DNA-like synthetic organic macromolecules; and directed self-assembly of quantum dot arrays and other nanoscale building blocks using physical synthesis and assembly methods. Each of these problems has clear relevance to the DOE mission, creating possible paradigm shifts in low-energy separations, catalysis, sensors, electronic devices, structural and functional hybrid materials, and computing. Each involves significantly different chemical and physical systems, so the range of TMS methods needed varies with each problem: the first involves self-assembly from solution; the second, biologically directed or bio-inspired self-assembly, again in solution; while the third generally does not involve solutions.

    In a related example, Figure 14 shows a new computational method for producing random nanoporous materials by mimicking the near-critical spinodal decomposition process used to produce controlled-pore glasses (CPGs). While it produces model CPGs that have pore distributions and sizes similar to those of actual CPGs, the method is not a fully detailed simulation of CPG synthesis. Refinement of such techniques to the point where new nanoporous materials can be discovered and synthesized computationally would be a paradigm-altering step forward in nanostructured materials synthesis.
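    The simplest continuum model of the spinodal decomposition being mimicked is the Cahn-Hilliard equation. The 1-D explicit time-stepping sketch below is our illustration of the underlying physics, not the molecular dynamics procedure of Gelb et al.; the grid size, time step, and parameters are arbitrary choices:

```python
import random

def laplacian(f):
    """Three-point periodic Laplacian on a 1-D grid with unit spacing."""
    n = len(f)
    return [f[(i - 1) % n] - 2.0 * f[i] + f[(i + 1) % n] for i in range(n)]

def cahn_hilliard_step(c, dt=0.01, gamma=1.0):
    """One explicit Euler step of the 1-D Cahn-Hilliard equation,
        dc/dt = laplacian(c**3 - c - gamma * laplacian(c)),
    the classic continuum model of spinodal decomposition."""
    mu = [ci ** 3 - ci - gamma * li for ci, li in zip(c, laplacian(c))]
    return [ci + dt * li for ci, li in zip(c, laplacian(mu))]

# Seed small random fluctuations about the unstable uniform state c = 0;
# the coarsening of the resulting domains mimics spinodal decomposition.
rng = random.Random(0)
c = [0.05 * rng.gauss(0.0, 1.0) for _ in range(64)]
total0 = sum(c)  # total "mass", conserved because the dynamics is a pure divergence
for _ in range(200):
    c = cahn_hilliard_step(c)
```

    A porous-material model would then be obtained by freezing the evolving composition field at some time and thresholding it into solid and void phases.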

    Figure 14. Schematic of controlled-pore glass model preparation. The phase separation process used to prepare real materials is imitated via molecular dynamics simulation, resulting in molecular models with realistic pore networks and structures. (After Gelb et al., Reports on Progress in Physics, Vol. 62, 1999, pp. 1573-1659)


    III. The Role of Applied Mathematics and Computer Science in Nanoscience

    I am never content until I have constructed a mechanical model of what I am studying. If I succeed in making one, I understand; otherwise I do not. . . . When you measure what you are speaking about and express it in numbers, you know something about it, but when you cannot express it in numbers your knowledge about it is of a meagre and unsatisfactory kind.

    William Thomson (Lord Kelvin), 1824-1907

    Kelvin's mathematical triumphalism is from a pre-quantum-mechanical era and must be discounted as chauvinistic in view of the immense understanding of complex phenomena that can be accumulated prior to satisfactory mathematization. Such hard-won understanding of phenomena, in fact, leads eventually to their mathematization. Nevertheless, Kelvin's agenda of quantitative understanding remains a prerequisite for simulation and predictive extrapolation of a system, if not for substantial understanding.

    Given the enormous power of computational simulation to transform any science or technology supported by a mathematical model, the mathematization of any new field is a compelling goal. The very complexity and breadth of possibilities in nanoscience and technology cry out for an increasing role for computational simulation today. This section, organized around three themes, indicates some of the directions this increasing role might take: in bridging time and length scales, in fast algorithms, and in optimization and predictability. In each of these representative areas, there are instances of "low-hanging fruit," of problems that are reasonably well defined but whose solutions present significant technical challenges, and of phenomena that are not yet at all well defined by Kelvinistic standards.

    In a complementary way, mathematics and simulation would be enormously stimulated by the challenges of nanoscale modeling. While some rapidly developing areas of mathematics, such as fast multipole and multigrid algorithms, are ready to apply (in fact, are already being used) in nanoscale modeling, in other areas there is much work to do just to get to the frontier of the science.

    It is important to keep in mind that no one can predict where the next mathematical breakthrough of consequence to nanoscience will come from: it could be from computational geometry or from graph theory; it might come from changing the mathematical structure of existing models. Changing the mathematical structure creates a new language, which often can expose an emergent simplicity in a seemingly complex landscape. For instance, no one could have predicted the significance of space-filling curves (a topological fancy) to the optimal layout of data in computer memories.
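    The space-filling-curve example can be made concrete. A Z-order (Morton) index interleaves the bits of grid coordinates so that points close in 2-D tend to land close together in linear memory; the bit-interleaving routine below is one standard construction, offered as a sketch rather than any code referenced by the report:

```python
def morton2d(x, y):
    """Interleave the bits of two 16-bit nonnegative integer coordinates
    into a single Z-order (Morton) index: x occupies the even bit
    positions and y the odd ones. Traversing indices in order walks the
    grid along a recursive Z-shaped, cache-friendly path."""
    code = 0
    for bit in range(16):
        code |= ((x >> bit) & 1) << (2 * bit)      # x bit -> even position
        code |= ((y >> bit) & 1) << (2 * bit + 1)  # y bit -> odd position
    return code

# The four cells of any 2x2 block map to four consecutive indices, e.g.
# (0,0)->0, (1,0)->1, (0,1)->2, (1,1)->3.
```

    Laying out array data in this order keeps small spatial neighborhoods within the same cache lines or memory pages, which is the effect alluded to above.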

    However, it is predictable that the forward progress of mathematical modeling will carry first one, then another area of nanoscience forward with it, provided that the minds of nanoscientists and mathematicians alike are prepared to experiment, to look for the points of connection, and to recognize opportunities as they arise. The many obvious connections described below should not overshadow the potential for the emergence of perhaps more important connections between nanoscience and other areas of mathematics in the near future.

    Hence, it will be important at this early stage not to define too narrowly what forms of applied mathematics will be important to nanoscience applications. Furthermore, the collaborative infrastructure and cross-fertilization between projects that can be facilitated in a DOE initiative foster creativity while making recognition of otherwise non-obvious connections and opportunities more likely.

    One of the simplest initial objectives of a collaborative dialog between scientists of the nanoscale regime and mathematical modelers is to be sure that the low-hanging fruit is regularly plucked. Mathematical results, and software libraries to implement them (many sponsored by the Department of Energy), arrive in a steady stream and readily find application in fields that traditionally engage applied mathematicians, such as engineering mechanics, semiconductor device physics, signal processing, and geophysical modeling. The differential and integral operator equations underlying mathematical physics have been placed on a firm foundation. Issues of existence and uniqueness, convergence and stability of numerical approximation, accuracy and algorithmic complexity, and efficiency and scalability of implementation on the agency's most powerful computers are well understood in many scientific areas.

    The variety of simulation tools available is enormous and includes Eulerian-type and Lagrangian-type adaptive methods for the evolution of fields, level sets, and particles. Complex multiscale algorithms have been developed for optimal solution of equilibrium and potential problems. Wonderful tools exist for creating low-dimensional representations of apparently high-information-content phenomena. Stochastic modeling techniques pull averages and higher moments from intrinsically nondeterministic systems. Moreover, the limits of applicability of these tools and techniques are well understood in many cases, permitting their use in a cost-effective way, given a requirement for accuracy. Sensitivity and stability analyses can accompany solutions.

    The interaction of scientists and engineers with applied mathematicians leads to continual improvement in scientific and mathematical understanding, computational tools, and the interdisciplinary training of the next generation of each. Due to the novelty of nanoscale phenomena, such interaction with applied mathematicians is often missing. Interactions must be systematically improved in existing and new areas.

    At the next level of objectives for a nanoscience-mathematics collaboration lie the problems that are apparently well defined by nanoscientists but for which there are no routinely practical mathematical solutions today. The number of particles that interact, the number of dimensions in which the problem is set, the range of scales to be resolved, the density of local minima in which the solution algorithm may be trapped, the size of the ensemble that needs to be examined for reliable statistics, or some other feature puts the existing model beyond the range of today's analysts and the capacities of today's computers. In these areas, some fundamentally new mathematics may be required.

    Finally, there exists a set of problems for which mathematicians must come alongside nanoscientists to help define the models themselves. A critical issue for revolutionary nanotechnologies is that the components and systems are in a size regime about whose fundamental behavior we have little understanding. The systems are too small for easily interpreted direct measurements, but too large to be described by current first-principles approaches. They exhibit too many fluctuations to be treated monolithically in time and space, but are too few to be described by a statistical ensemble. In many cases, they fall between the regimes of applicability of trusted models. Perhaps existing theories can be stretched. More likely, new theories, and hence new mathematical models, will be required before robust fundamental understandings will emerge.

    Once a theoretical model becomes a trusted companion to experiment, it can multiply the effectiveness of experiments that are difficult to instrument or interpret, take too long to set up, or are simply too expensive to perform. Powerful computational engines of simulation can be called upon to produce, to twist the words of R. W. Hamming, not just insight, but numbers as well.

    Those numbers are the essence of control and predictability.

    A. Bridging Time and Length Scales

    Most phenomena of interest in nanoscience and nanotechnology, like many phenomena throughout science and engineering, demand that multiple scales be represented for meaningful modeling. However, phenomena of interest at the nanoscale may be extreme in this regard because of the lower end of the scale range. Many phenomena of importance interact with their environment at macro space and time scales, but must be modeled in any first-principles sense at atomistic scales.

    The time scale required to resolve quantum mechanical oscillations may be as small as a femtosecond (10⁻¹⁵ s), whereas protein folding requires microseconds (a range of 10⁹), and engineered nanosystems may require still longer simulation periods; e.g., creep of materials relevant to reliability considerations may be measured in years. Counting trials in probabilistic analyses contributes another time-like factor to the computational complexity, although this latter factor comes with welcome, nearly complete algorithmic concurrency, unlike evolutionary time. Meanwhile, spatial scales may span from angstroms to centimeters (a range of 10⁸). The cube of this would be characteristic of a three-dimensional continuum-based analysis. Counting atoms or molecules in a single physical ensemble easily leads to the same range; Avogadro's number is about 10²⁴.
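    The decade counts quoted above can be tallied in a few lines of arithmetic; the physical values are the order-of-magnitude figures from the text, not precise constants:

```python
import math

# Order-of-magnitude scales quoted in the text.
femtosecond = 1e-15   # quantum-oscillation time scale, seconds
microsecond = 1e-6    # protein-folding time scale, seconds
angstrom = 1e-10      # atomic length scale, meters
centimeter = 1e-2     # continuum/device length scale, meters

time_decades = round(math.log10(microsecond / femtosecond))  # 9 decades in time
length_decades = round(math.log10(centimeter / angstrom))    # 8 decades in length
volume_decades = 3 * length_decades                          # 24 decades in volume
# 10**24 is indeed the order of Avogadro's number (~6.022e23).
```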

    Exploiting the concurrency available with multiple independent ensembles multiplies storage requirements. Space-time resource requirements are staggering for many simulation scenarios. Therefore, nanoscience poses unprecedented challenges to the mathematics of multiscale representation and analysis, and to the mathematics of synthesizing models that operate at different scales.

    Existing techniques may certainly yield some fruit, but we are undoubtedly in need of fundamental breakthroughs in the form of new techniques. Faced with similar types of challenges in macroscopic continuum problems, applied mathematicians have devised a wide variety of adaptive schemes (adaptation in mesh, adaptation in discretization order, adaptation in model fidelity, etc.). Bringing these schemes to a science has required decades, and is not complete in the sense of predictable error control for a given investment, e.g., for some hyperbolic problems. Probably the greatest range of scales resolved purely by mesh adaptivity to date is about 10 decades, in cosmological gravitation problems. Of course, this range is practical only when the portion of the domain demanding refinement is small compared to the domain size required to include all of the objects of interest or, what is often the more demanding requirement, to graft on to reliably posed boundary conditions in the far field. Migrating such adaptive schemes into production software (typically through community models, sometimes in commercial analysis packages) lags their scientific development by at least an academic generation, and usually more. The challenge for nanoscience is therefore not only great, but also urgent. Applied mathematicians being trained today can acquire a doctoral degree without ever having been exposed to nanoscale phenomena.

    The mathematical models and algorithms of nanoscience cover the gamut of continuous and discrete, deterministic and random. Ultimately, all simulations are executed by deterministic programs (at least, to within the limits of race conditions on parallel processors) on discrete data. The most capable computer platforms of 2002 (costing in excess of $100M) execute at most tens of teraflop/s (10¹³ floating point operations per second), store in fast memory at most tens of terabytes, and store in disk farms at most hundreds of terabytes. Moreover, these machines do not perform efficiently for models at these capability extremes. A typical extreme problem in computational science today encompasses only billions of discrete degrees of freedom (filling storage larger than this with auxiliary data, including geometry, physical constitutive data, and linear algebra workspaces) and runs for days at tens to hundreds of gigaflop/s.