Accelerator-enabled quantum chemistry: a viable path to high-throughput HPC?

David M. Benoit
E.A. Milne Centre for Astrophysics & G.W. Gray Centre for Advanced Materials
Chemistry Building, School of Physical and Mathematical Sciences
The University of Hull, Cottingham Road, Kingston upon Hull HU6 7RX, UK
[email protected] · @dbenoit1
Computing Insight UK 2016 | Manchester | 14th December 2016
PVSCF
HPC @ Hull – Implications for the institution
• No previous institutional experience in HPC
• Research-community-driven process that convinced University management
• £2.1M capital investment
• Strong partnerships with:
  – Red Oak for project management
  – ClusterVision for hardware
• HPC@Hull focussing on future HPC technologies
Building VIPER in 50 days
[Photo captions: Empty room… · New cooling installation · Factory testing @ ClusterVision · Shipping to Hull · First compute rack · Compute nodes arrived! · Omni-Path installed · VIPER!]
Making it work…
• Project management is vital
• Delivery @ Hull: 13 May 2016
• Go live: 28 June 2016
• Time from delivery to first job: 47 days
• Full system performance (180 nodes):
  – Theoretical peak (100% efficiency): 194 TF
  – Observed average performance (SAT-HPL): 172 TF
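The gap between theoretical peak and sustained HPL throughput can be sanity-checked with simple arithmetic on the figures above (nothing below goes beyond the slide's own numbers):

```python
# Quick sanity check of the VIPER HPL figures quoted on the slide:
# 180 nodes, 194 TF theoretical peak, 172 TF sustained (SAT-HPL).

def hpl_efficiency(observed_tflops: float, peak_tflops: float) -> float:
    """Sustained HPL efficiency as a fraction of theoretical peak."""
    return observed_tflops / peak_tflops

peak = 194.0       # TF, 100%-efficiency theoretical figure
observed = 172.0   # TF, observed SAT-HPL average
nodes = 180

print(f"Efficiency: {hpl_efficiency(observed, peak):.1%}")        # ≈ 88.7%
print(f"Per-node sustained: {observed * 1000 / nodes:.0f} GF")    # ≈ 956 GF
```

Just under 89% of peak is a respectable full-system HPL figure, consistent with the near-gigaflop-per-node numbers quoted on the containerisation slide below.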
VIPER runs containerised HPC
• Docker containers on each node
• Improves resilience
• Greater flexibility
• Potential to spin up/store containers for different configurations or applications
• Performance vs bare metal (1 node):
  – HPL in Docker: 992.5 GF
  – HPL bare metal: 993.5 GF
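A container image for a benchmark like HPL is straightforward to build; the fragment below is an illustrative sketch only — the base image, package names, and build layout are assumptions, not VIPER's actual configuration:

```dockerfile
# Illustrative sketch of an HPL container, in the spirit of the setup above.
# Base image, packages and paths are assumptions, not VIPER's actual image.
FROM ubuntu:16.04

RUN apt-get update && apt-get install -y \
    build-essential gfortran wget \
    openmpi-bin libopenmpi-dev libopenblas-dev

# Fetch and unpack the standard netlib HPL benchmark
RUN wget http://www.netlib.org/benchmark/hpl/hpl-2.2.tar.gz \
    && tar xzf hpl-2.2.tar.gz
WORKDIR /hpl-2.2
# A Make.<arch> file pointing at OpenBLAS and OpenMPI would be supplied
# here (site-specific compiler and library paths elided), followed by
# `make arch=<arch>` and a tuned HPL.dat (problem size N, block size NB,
# process grid P×Q).
```

The slide's 992.5 GF in-container vs 993.5 GF bare-metal comparison (~0.1% overhead) suggests the container layer costs essentially nothing for compute-bound work.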
VIPER uses Omni-Path as its communication fabric
• HPCC random-ring latency is low:
  – Omni-Path (4 nodes): 0.92 µs
  – 100 Gbps InfiniBand: ~1.25 µs
• Application performance equal to or better than 100 Gbps InfiniBand
• Network capacity (HPCC, 4 nodes):
  – Average G-PTRANS: 35 Gb/s
  – Average RandomAccess test: 1.10 GUPS
• Still a very "new" fabric, which is likely under-utilised by current applications
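The RandomAccess (GUPS) figure above counts random 64-bit XOR updates to a large table per second. A toy, single-process sketch of that kernel (not the distributed MPI benchmark itself; the table size and update count here are arbitrary illustrative values):

```python
import random
import time

def gups_estimate(table_bits: int = 20, n_updates: int = 100_000) -> float:
    """Toy single-node version of the HPCC RandomAccess kernel:
    XOR pseudo-random 64-bit values into random slots of a table
    and report giga-updates per second (GUPS)."""
    size = 1 << table_bits            # table length, a power of two
    table = list(range(size))
    rng = random.Random(42)
    start = time.perf_counter()
    for _ in range(n_updates):
        r = rng.getrandbits(64)
        table[r & (size - 1)] ^= r    # random index, XOR update
    elapsed = time.perf_counter() - start
    return n_updates / elapsed / 1e9

if __name__ == "__main__":
    print(f"~{gups_estimate():.4f} GUPS (single Python process)")
```

A single Python process manages only a tiny fraction of a GUPS; the 1.10 GUPS quoted above comes from the distributed version running across four Omni-Path-connected nodes, which is exactly the fabric-stressing, fine-grained traffic this test is designed to measure.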
VIPER's 500 TB parallel filesystem runs on BeeGFS
• Parallel filesystem focussing on performance, similar to Lustre
• Simplicity of the filesystem makes it easy to manage
• VIPER implements a high-resilience design:
  – 2 BeeGFS storage server nodes
  – multiple JBOD RAID6 arrays with multiple global hot spares per file server
  – each node can mount storage attached to the other node
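On the client side, part of BeeGFS's simplicity is that the whole mount is driven by a small configuration file. The fragment below is illustrative only — the hostname and values are assumptions, not VIPER's actual settings:

```
# /etc/beegfs/beegfs-client.conf (fragment) — illustrative sketch only;
# the management hostname is hypothetical, not VIPER's real configuration.
sysMgmtdHost            = viper-mgmt   # node running the BeeGFS management daemon
connMaxInternodeNum     = 12           # parallel connections per client/server pair
```

With the two storage servers able to mount each other's JBOD arrays as described above, a failed server's targets can be brought back online from its partner without clients needing any configuration change.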