TeraGrid Institute: Allocation Policies and Best Practices
David L. Hart, SDSC
June 4, 2007

The Basics
- Who, what, when, where, why, and how

The Lingo
- DAC: Development Allocation Committee
- MRAC: Medium Resource Allocations Committee
- LRAC: Large Resource Allocations Committee
- POPS: Partnerships Online Proposal System
- Roaming: TeraGrid (Wide) Roaming (Access)
- SU: Service Unit

The Process: Getting Started
Start-up Allocations (DAC Awards)
- Accepted, reviewed, and awarded on a continual basis
- Up to 30,000 SUs of TeraGrid Roaming
- Best for:
  - Code development, porting, and testing
  - Gathering performance data for MRAC/LRAC proposals
  - Classroom instruction
  - Small-scale computational needs

10 Minutes to an Allocation
1. Go to POPS.
2. Create a POPS user ID.
3. Log in.
4. Select "New" proposal type.
5. Select 0-30,000.
6. Click on DAC-TeraGrid.
7. Fill out the PI Info, Proposal Info, and Resource Request screens.
8. Upload the PI's CV.
9. Press Final Submission.

The Process: Going MRAC (or LRAC)
- PIs need to be aware of the lead time for getting an MRAC or LRAC award
- Requires a written proposal, reviewed by domain experts
- LRAC: more than 500,000 SUs; reviewed semi-annually; awards begin April 1 and Oct. 1
- MRAC: limit of 500,000 SUs; reviewed quarterly; awards begin Jan. 1, April 1, July 1, and Oct. 1
(A rough request-sizing sketch, in SUs, appears after the proposal slides below.)

The Awards
- One per PI
- Allocations made for 12-month periods
- Unused SUs are forfeited at the end of an award period
- Add users to a grant via the TeraGrid User Portal
- Progress report required annually as part of renewal proposals and multi-year awards

The Options: Asking for Help
- Multi-year awards: possible, but not recommended for new PIs; only progress reports are required in subsequent years
- Justifications: to address reviewer concerns and get more of the requested SUs; best for specific omissions (not to salvage horrible proposals)
- Supplements: request additional SUs during a 12-month allocation period; not for DACs! Reviewed by MRAC/LRAC members
- Extensions: can extend the award period an additional 6 months for cause; no additional SUs!
- Advances: up to 10% of an MRAC/LRAC request can be provided in advance

The Resources: Compute
- Compute (also visualization) resources; see the TeraGrid Resources Catalog
- Can request specific resource(s) or TeraGrid Roaming, except TeraGrid DACs, which are roaming only
- Requests made in SUs
- SDSC's Blue Gene

The Resources: Storage
- Long-term disk and tape
- Policies evolving, but some already available for award: Indiana HPSS Archive, SDSC Database, SDSC Collections Disk Space, TeraGrid GPFS-WAN
- Look for announcements in this area soon

The Resources: Advanced Support
- NEW! Dedicated TeraGrid staff assistance
- Limited resources; MRAC/LRAC reviewers rate possible projects
- Extra info required for proposals

The Proposal: POPS
- Straightforward (mostly), once you get to the Web-based data entry forms
- Latest changes: supporting grant information
- Coming soon: better TeraGrid integration
- https://pops-submit.ci-partnership.org/

The Proposal: Proposal Document(s)
- The real key to a successful review
- There are page limits!
- Sample proposals are online
- But now, some tips and advice
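Because the DAC, MRAC, and LRAC paths above are distinguished by SU thresholds, it can help to translate a planned workload into a rough request size before picking a proposal path. The sketch below is illustrative only: it assumes one SU is charged as roughly one processor-hour (actual charge rates vary by resource), uses the 30,000-SU DAC and 500,000-SU MRAC limits quoted earlier, and the workload numbers are hypothetical.

# Rough sizing of a TeraGrid allocation request (illustrative sketch).
# Assumes 1 SU ~= 1 processor-hour; actual charge rates vary by resource.

DAC_LIMIT = 30_000      # start-up (DAC) ceiling, in SUs
MRAC_LIMIT = 500_000    # MRAC ceiling, in SUs; above this is LRAC

def estimate_sus(runs, processors, hours_per_run):
    """Estimated SUs for one batch of runs: runs x processors x wall-clock hours."""
    return runs * processors * hours_per_run

def request_tier(total_sus):
    """Map a total SU estimate onto the allocation paths described above."""
    if total_sus <= DAC_LIMIT:
        return "DAC start-up (reviewed continually)"
    if total_sus <= MRAC_LIMIT:
        return "MRAC (reviewed quarterly)"
    return "LRAC (reviewed semi-annually)"

# Hypothetical workload: 200 production runs on 64 processors, 8 hours each.
total = estimate_sus(runs=200, processors=64, hours_per_run=8)
print(total, "SUs ->", request_tier(total))   # 102400 SUs -> MRAC (reviewed quarterly)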
Traditional v. Community
- MRAC/LRAC proposals are accepted in four general categories of research activities:
  - Individual investigators
  - Research collaborations (e.g., the MILC consortium)
  - Community Projects (e.g., NEES)
  - Community Services (e.g., ROBETTA, Gateways)
- The general requirements for proposals of all four types remain largely the same.

Proposal Review Criteria
- Computational Methodology: The choice of applications, methods, algorithms, and techniques to be employed to accomplish the stated objectives should be reasonably justified. While the accomplishment of the stated objectives in support of the science is important, it is incumbent on proposers to consider the methods available to them and to use the one best suited.
- Appropriate Use of Resources: The resources chosen must be an appropriate match for the applications and methodologies to be used and must be in accordance with the recommended use guidelines for those resources.
- Efficient Use of Resources: The resources selected must be used as efficiently as is reasonably possible. To meet this criterion, performance and parallel scaling data should be provided for all applications to be used, along with a discussion of optimization and/or parallelization work to be done to improve the applications.

Additional Review Considerations
- Prior progress (from a prior-year allocation, a DAC award, or work done locally)
- Ability to complete the work plan described (more significant for larger requests)
- Sufficient merit-reviewed funding
- Staff, both number and experience
- Local computing environment
- Other access to HPC resources (e.g., campus centers, DOE centers)

General Proposal Outline
I. Research Objectives
II. Codes and methods to be used
III. Computational plan
IV. Justification for SUs (TB-yrs) requested
V. Additional considerations
Note: Sections III and IV are often integrated.

I. Research Objectives
- Traditional proposals: describe the research activities to be pursued.
- Community proposals: describe the classes of research activities that the proposed effort will support.
- Keep it short: you only need enough detail to support the methods and computational plan being proposed.
- TIP: Reviewers don't want to read the proposal you submitted to NSF/NIH/etc., but they will notice whether you have merit-reviewed funding.

II. Codes (Data) and Methods
- Very similar between traditional and community proposals; more significant if you are using home-grown codes.
- Provide performance and scaling details on problems and test cases similar to those being pursued.
- Ideally, provide performance and scaling data collected by you for the specific resource(s) you are requesting (a small scaling-table sketch appears after Section III below).

III. Computational Plan
- Traditional proposals: explicitly describe the problem cases you will examine.
  - BAD: "a dozen or so important proteins under various conditions"
  - GOOD: "7 proteins [listed here; include the scientific importance of these selections somewhere, too]. Each protein will require [X] runs, varying 3 parameters [listed here] [in very specific and scientifically meaningful ways]."
- Community proposals: explicitly describe the typical use case(s) that the gateway supports and the type of runs that you expect users to make, and describe how you will help ensure that the community will make scientifically meaningful runs (if applicable).
  - BAD: "the gateway lets users run NAMD on TeraGrid resources"
  - BETTER: "users will run NAMD jobs on [biological systems like this]"
  - BETTER STILL: "the gateway allows users to run NAMD jobs on up to 128 processors on problem sizes limited [in some fashion]"
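Since both the Efficient Use criterion and Section II ask for performance and parallel scaling data, a compact table of measured timings, speedups, and efficiencies is a simple way to present it. The sketch below is minimal and uses made-up timings; the core counts and times are placeholders, not measurements from any actual TeraGrid resource.

# Minimal scaling table from measured wall-clock times (placeholder numbers).
# Speedup and efficiency are computed relative to the smallest core count measured.

timings = {            # cores -> wall-clock seconds for a representative test case
    16: 1800.0,
    32: 950.0,
    64: 520.0,
    128: 310.0,
}

base_cores = min(timings)
base_time = timings[base_cores]

print(f"{'cores':>6} {'time (s)':>10} {'speedup':>8} {'efficiency':>11}")
for cores, t in sorted(timings.items()):
    speedup = base_time / t
    efficiency = speedup * base_cores / cores
    print(f"{cores:>6} {t:>10.1f} {speedup:>8.2f} {efficiency:>10.0%}")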
IV. Justification of SUs (TB-yrs)
- Traditional proposals: if you've done Sections II and III well, this section should be a straightforward math problem.
- For each research problem, calculate the SUs required based on the runs defined in Section III and the timings in Section II, broken out appropriately by resource (a worked tabulation sketch appears at the end of this transcript).
- Reasonable scaling estimates from test-case timing runs to full-scale production runs are acceptable.
- Clear presentation here will allow reviewers to cut time or storage in a rational fashion.

IV. Justification of SUs (TB-yrs)
- Community proposals: the first big trick is calculating SUs when you don't know the precise runs to be made a priori.
- In Year 2 and beyond: start with an estimate of total usage based on the prior year's usage patterns and an estimate of the coming year's usage patterns (justify these in Section V). From this information, along with data from Sections II and III, you can come up with a tabulation of SU estimates.
- Year 1 requires bootstrapping: pick conservative values (and justify them) for the size of the community and the runs to be made, and calculate SUs.
- TIP: Start modestly. If you have ~0 users, don't expect the reviewers to believe that you will get thousands (or even hundreds) next year.

V. Additional Considerations
- For traditional proposals, these are not controversial:
  - Local computing environment
  - Other supercomputing resources
  - Prior progress
  - Experience/staffing

V. Additional Considerations
- For community proposals, these components can provide key details:
  - Community Support and Management Plan: describe the gateway interface in terms of how it helps the community burn SUs; describe plans for growing the user community, graduating users to MRAC awards, and regulating gateway hogs.
  - Progress report: the actual user community and usage patterns; manuscripts produced thanks to this service.
  - Local computing environment
  - Other HPC resources

Questions?
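As a companion to Section IV, here is a minimal sketch of the "straightforward math problem" for a traditional proposal: each research problem's SUs follow from the runs defined in Section III and the per-run timings from Section II. The problem labels, run counts, and timings below are hypothetical, and the sketch again assumes one SU is charged as roughly one processor-hour; check the charge policy of each resource you request.

# Hypothetical SU tabulation for Section IV of a traditional proposal.
# Assumes 1 SU ~= 1 processor-hour; actual charge rates differ between resources.

problems = [
    # (label from Section III, runs, processors per run, wall-clock hours per run)
    ("Protein A, 3-parameter sweep", 90, 128, 6.0),
    ("Protein B, refinement runs",   40, 64, 12.0),
    ("Scaling and test runs",        20, 32, 2.0),
]

total = 0
for label, runs, procs, hours in problems:
    sus = runs * procs * hours          # runs x processors x hours per run
    total += sus
    print(f"{label:<32} {sus:>10,.0f} SUs")
print(f"{'Total request':<32} {total:>10,.0f} SUs")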