Introduction to Parallel Computing: Programming using the Message-Passing Paradigm (Irene Moulitsas)


Feb 05, 2021



Transcript
  • Introduction to Parallel Computing

    Irene Moulitsas
    Programming using the Message-Passing Paradigm

  • MPI Background

    MPI: Message Passing Interface
    Began in Supercomputing ’92

    Vendors: IBM, Intel, Cray

    Library writers: PVM

    Application specialists: National Laboratories, Universities

  • Why MPI?

    One of the oldest libraries
    Widespread adoption; portable
    Minimal requirements on the underlying hardware
    Explicit parallelization: intellectually demanding, but achieves high performance and scales to large numbers of processors

  • MPI Programming Structure

    Asynchronous: hard to reason about, non-deterministic behavior

    Loosely synchronous: processes synchronize to perform interactions; easier to reason about

    SPMD: Single Program Multiple Data

  • MPI Features

    Communicator information
    Point-to-point communication
    Collective communication
    Topology support
    Error handling

  • Six Golden MPI Functions

    The MPI standard defines about 125 functions
    Six of them are the most used and suffice for many programs
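The slide code itself is not captured in this transcript. The six functions usually meant here are MPI_Init, MPI_Finalize, MPI_Comm_size, MPI_Comm_rank, MPI_Send, and MPI_Recv; the following minimal sketch (variable names are illustrative, run with at least 2 processes, e.g. mpirun -np 2) touches all six:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);                    /* 1: start MPI */
    MPI_Comm_size(MPI_COMM_WORLD, &size);      /* 2: how many processes */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);      /* 3: which one am I */

    if (size >= 2) {
        if (rank == 0) {
            int token = 42;
            MPI_Send(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);      /* 4 */
        } else if (rank == 1) {
            int token;
            MPI_Recv(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);                             /* 5 */
            printf("rank 1 received %d\n", token);
        }
    }

    MPI_Finalize();                            /* 6: shut MPI down */
    return 0;
}
```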

  • MPI Functions: Initialization

    MPI_Init and MPI_Finalize must be called by all processes
    Successful calls return MPI_SUCCESS
    Prototypes are declared in “mpi.h”

  • MPI Functions: Communicator

    MPI_Comm is the communicator type
    MPI_COMM_WORLD is the predefined communicator containing all processes

  • Hello World!

  • Hello World! (correct)
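The two Hello World listings are missing from the transcript. A correct version, as usually shown on such slides, keeps every MPI call between MPI_Init and MPI_Finalize:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;

    /* All MPI calls must come after MPI_Init ... */
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    printf("Hello World! I am process %d of %d\n", rank, size);

    MPI_Finalize();   /* ... and before MPI_Finalize. */
    return 0;
}
```

Note that the output lines from different processes may appear in any order.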

  • MPI Functions: Send, Recv

    The source argument of MPI_Recv may be MPI_ANY_SOURCE

    MPI_Status carries the MPI_SOURCE, MPI_TAG, and MPI_ERROR fields
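As a sketch of these two points (the value and tag here are arbitrary; run with at least 2 processes), the receiver can use the wildcards and then recover the actual sender and tag from the status object:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size >= 2) {
        if (rank == 0) {
            int value = 7;
            MPI_Send(&value, 1, MPI_INT, 1, 99, MPI_COMM_WORLD);
        } else if (rank == 1) {
            int value;
            MPI_Status status;
            /* MPI_ANY_SOURCE / MPI_ANY_TAG match any sender and tag;
               the actual ones are read back from the status object. */
            MPI_Recv(&value, 1, MPI_INT, MPI_ANY_SOURCE, MPI_ANY_TAG,
                     MPI_COMM_WORLD, &status);
            printf("got %d from rank %d with tag %d\n",
                   value, status.MPI_SOURCE, status.MPI_TAG);
        }
    }

    MPI_Finalize();
    return 0;
}
```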

  • MPI Functions: Datatypes

  • Send/Receive Examples

  • Blocking Non-Buffered Communication
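The example code for this slide is not in the transcript. The hazard usually illustrated here is that, under a non-buffered (synchronous) protocol, a blocking MPI_Send does not return until the matching receive is posted, so the following send-before-receive pattern can deadlock; a sketch (N is an illustrative message size, run with 2 processes):

```c
#include <mpi.h>
#include <stdio.h>

#define N 1   /* small messages often survive only due to internal buffering;
                 with large N this pattern can deadlock */

int main(int argc, char *argv[])
{
    int rank, size, a[N] = {0}, b[N];
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size >= 2) {
        /* Both ranks send first, then receive: under non-buffered
           semantics each MPI_Send waits for a receive that neither
           process can post, and the two block on each other forever. */
        if (rank == 0) {
            MPI_Send(a, N, MPI_INT, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(b, N, MPI_INT, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        } else if (rank == 1) {
            MPI_Send(a, N, MPI_INT, 0, 0, MPI_COMM_WORLD);
            MPI_Recv(b, N, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        }
    }

    MPI_Finalize();
    return 0;
}
```

A standard fix is to reverse the send/receive order on one of the ranks, or to use MPI_Sendrecv.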

  • Send/Receive Examples

  • Blocking Buffered Communication

  • Send/Receive Examples

  • MPI Functions: SendRecv
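The slide body is missing; as a sketch, MPI_Sendrecv pairs a send and a receive in one call, so a ring-shift like the following cannot deadlock (the shifted value is illustrative):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size, sendval, recvval;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int right = (rank + 1) % size;          /* neighbor to send to */
    int left  = (rank - 1 + size) % size;   /* neighbor to receive from */
    sendval = rank;

    MPI_Sendrecv(&sendval, 1, MPI_INT, right, 0,
                 &recvval, 1, MPI_INT, left,  0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    printf("rank %d received %d from rank %d\n", rank, recvval, left);
    MPI_Finalize();
    return 0;
}
```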

  • MPI Functions: ISend, IRecv

    Non-blocking; each call returns immediately with an MPI_Request handle

  • MPI Functions: Test, Wait

    MPI_Test checks whether the operation has finished
    MPI_Wait blocks until the operation has finished
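A sketch combining the two slides above (buffer names are illustrative; run with at least 2 processes): post the non-blocking calls, optionally compute while they progress, then wait for completion:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size, sendbuf, recvbuf;
    MPI_Request reqs[2];
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size >= 2 && rank < 2) {
        int other = 1 - rank;
        sendbuf = rank;
        /* Both calls return immediately with a request handle each. */
        MPI_Irecv(&recvbuf, 1, MPI_INT, other, 0, MPI_COMM_WORLD, &reqs[0]);
        MPI_Isend(&sendbuf, 1, MPI_INT, other, 0, MPI_COMM_WORLD, &reqs[1]);
        /* ... useful computation can overlap the communication here ... */
        MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);  /* block until both done */
        printf("rank %d got %d\n", rank, recvbuf);
    }

    MPI_Finalize();
    return 0;
}
```

MPI_Test(&req, &flag, &status) could be used instead of waiting: it returns immediately and sets flag once the transfer has completed.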

  • Non-Blocking Non-Buffered Communication

  • Example

  • Example

  • Example

  • MPI Functions: Synchronization

  • Collective Communications

    One-to-All Broadcast
    All-to-One Reduction
    All-to-All Broadcast & Reduction
    All-Reduce & Prefix-Sum
    Scatter and Gather
    All-to-All Personalized

  • MPI Functions: Broadcast
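The broadcast example is missing from the transcript; a minimal sketch (the value 123 is illustrative):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, value = 0;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0)
        value = 123;               /* only the root holds the data */

    /* One-to-all broadcast: after the call, every process has 123. */
    MPI_Bcast(&value, 1, MPI_INT, 0, MPI_COMM_WORLD);

    printf("rank %d: value = %d\n", rank, value);
    MPI_Finalize();
    return 0;
}
```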

  • MPI Functions: Scatter & Gather
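As a sketch of the scatter/gather pair (one int per process; the array contents are illustrative): the root deals out one element to each process, each process modifies its element, and the root collects the results back:

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    int rank, size, mine;
    int *data = NULL, *result = NULL;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (rank == 0) {               /* buffers are needed only at the root */
        data   = malloc(size * sizeof(int));
        result = malloc(size * sizeof(int));
        for (int i = 0; i < size; i++) data[i] = 10 * i;
    }

    /* Root sends element i of data to process i ... */
    MPI_Scatter(data, 1, MPI_INT, &mine, 1, MPI_INT, 0, MPI_COMM_WORLD);
    mine += 1;                     /* each process works on its piece */
    /* ... and collects the modified pieces back in rank order. */
    MPI_Gather(&mine, 1, MPI_INT, result, 1, MPI_INT, 0, MPI_COMM_WORLD);

    if (rank == 0) {
        for (int i = 0; i < size; i++) printf("%d ", result[i]);
        printf("\n");
        free(data); free(result);
    }
    MPI_Finalize();
    return 0;
}
```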

  • MPI Functions: All Gather
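A sketch for all-gather (the contributed values are illustrative): like MPI_Gather, but every process, not just a root, receives the concatenated array:

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int mine = rank * rank;                 /* each process contributes one value */
    int *all = malloc(size * sizeof(int));

    /* No root argument: every process receives the full array. */
    MPI_Allgather(&mine, 1, MPI_INT, all, 1, MPI_INT, MPI_COMM_WORLD);

    printf("rank %d sees:", rank);
    for (int i = 0; i < size; i++) printf(" %d", all[i]);
    printf("\n");
    free(all);
    MPI_Finalize();
    return 0;
}
```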

  • MPI Functions: All-to-All Personalized
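A sketch of all-to-all personalized communication (message values are illustrative): every process sends a distinct element to every other process, so element i of the send buffer goes to process i and element j of the receive buffer arrives from process j:

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int *sendbuf = malloc(size * sizeof(int));
    int *recvbuf = malloc(size * sizeof(int));
    for (int i = 0; i < size; i++)
        sendbuf[i] = 100 * rank + i;    /* distinct message for each peer */

    MPI_Alltoall(sendbuf, 1, MPI_INT, recvbuf, 1, MPI_INT, MPI_COMM_WORLD);

    printf("rank %d received:", rank);
    for (int i = 0; i < size; i++) printf(" %d", recvbuf[i]);
    printf("\n");
    free(sendbuf); free(recvbuf);
    MPI_Finalize();
    return 0;
}
```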

  • MPI Functions: Reduction

  • MPI Functions: Operations
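Combining the two slides above, a sketch of MPI_Reduce with two of the predefined operations (the local values are illustrative); only the root receives each result:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, localsum, localmax, totalsum, globalmax;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    localsum = rank + 1;
    localmax = rank;

    /* Combine one value per process with a predefined MPI_Op;
       the result appears only at the root (rank 0 here). */
    MPI_Reduce(&localsum, &totalsum,  1, MPI_INT, MPI_SUM, 0, MPI_COMM_WORLD);
    MPI_Reduce(&localmax, &globalmax, 1, MPI_INT, MPI_MAX, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum = %d, max = %d\n", totalsum, globalmax);
    MPI_Finalize();
    return 0;
}
```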

  • MPI Functions: All-reduce

    Same as MPI_Reduce, but all processes receive the result of the MPI_Op operation.
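As a sketch (computing a global mean; the local values are illustrative), note that MPI_Allreduce takes no root argument because every process gets the result:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size;
    double local, sum;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    local = (double)rank;
    /* Like MPI_Reduce, but the result lands on every process. */
    MPI_Allreduce(&local, &sum, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    printf("rank %d: mean = %f\n", rank, sum / size);
    MPI_Finalize();
    return 0;
}
```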

  • MPI Functions: Prefix Scan
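A sketch of the prefix operation (contributed values are illustrative): MPI_Scan is an inclusive prefix reduction, so process i receives the combination of the values from processes 0 through i:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, value, prefix;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    value = rank + 1;
    /* Inclusive prefix sum: rank i gets value_0 + ... + value_i. */
    MPI_Scan(&value, &prefix, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    printf("rank %d: prefix sum = %d\n", rank, prefix);
    MPI_Finalize();
    return 0;
}
```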

  • MPI Names

  • MPI Functions: Topology
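The topology slide content is not in the transcript; a typical sketch builds a 2-D Cartesian communicator (the periodic 2-D grid here is an assumed example):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank, size, dims[2] = {0, 0}, periods[2] = {1, 1}, coords[2];
    MPI_Comm cart;
    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Let MPI factor the processes into a balanced 2-D grid, ... */
    MPI_Dims_create(size, 2, dims);
    /* ... build a periodic Cartesian communicator (reorder allowed), ... */
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, &cart);
    /* ... and look up this process's grid coordinates. */
    MPI_Comm_rank(cart, &rank);
    MPI_Cart_coords(cart, rank, 2, coords);

    printf("rank %d at (%d,%d) in a %dx%d grid\n",
           rank, coords[0], coords[1], dims[0], dims[1]);
    MPI_Comm_free(&cart);
    MPI_Finalize();
    return 0;
}
```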

  • Performance Evaluation

    Elapsed (wall-clock) time
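A sketch of the usual timing pattern: synchronize all processes with MPI_Barrier, then bracket the measured region with MPI_Wtime, which returns elapsed wall-clock seconds:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank;
    double t0, t1;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Barrier(MPI_COMM_WORLD);   /* start everyone together */
    t0 = MPI_Wtime();
    /* ... code being measured ... */
    t1 = MPI_Wtime();

    if (rank == 0)
        printf("elapsed wall-clock time: %f s\n", t1 - t0);
    MPI_Finalize();
    return 0;
}
```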

  • Matrix/Vector Multiply
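The matrix-vector code is not captured in the transcript. A common formulation, sketched here under the assumptions that the matrix is distributed rowwise and N is divisible by the process count: broadcast the vector, scatter the rows, compute local dot products, gather the result:

```c
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

#define N 4   /* matrix dimension; assumed divisible by the process count */

int main(int argc, char *argv[])
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int rows = N / size;                 /* rows owned by each process */
    double A[N][N], x[N], y[N];
    double *localA = malloc(rows * N * sizeof(double));
    double *localy = malloc(rows * sizeof(double));

    if (rank == 0) {                     /* root initializes A and x */
        for (int i = 0; i < N; i++) {
            x[i] = 1.0;
            for (int j = 0; j < N; j++) A[i][j] = i + j;
        }
    }

    /* Every process needs the whole vector; each gets its block of rows. */
    MPI_Bcast(x, N, MPI_DOUBLE, 0, MPI_COMM_WORLD);
    MPI_Scatter(A, rows * N, MPI_DOUBLE, localA, rows * N, MPI_DOUBLE,
                0, MPI_COMM_WORLD);

    for (int i = 0; i < rows; i++) {     /* local dot products */
        localy[i] = 0.0;
        for (int j = 0; j < N; j++) localy[i] += localA[i * N + j] * x[j];
    }

    /* Reassemble the result vector on the root. */
    MPI_Gather(localy, rows, MPI_DOUBLE, y, rows, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    if (rank == 0)
        for (int i = 0; i < N; i++) printf("y[%d] = %f\n", i, y[i]);
    free(localA); free(localy);
    MPI_Finalize();
    return 0;
}
```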