Core Computational Facility
@ UQ School of Mathematics and Physics
Welcome to the School of Mathematics and Physics Research Computing Resources Webpages.
The goal of the Core Computational Facility is to provide cutting-edge computing
resources and training to SMP researchers before they move more demanding calculations to
the UQ and NCI computing systems.
Our facilities are administered by the ITS Research Infrastructure team.
System news is posted to the email list ( smp-hpc@lists.science.uq.edu.au ), to which you are automatically subscribed once you have requested access to getafix / dogmatix. You can also subscribe yourself at http://lists.science.uq.edu.au/mailman/listinfo/smp-hpc to keep informed of SMP system news. The list also serves as a forum for questions and answers.
Getting Started
To start using the SMP cluster:
- Request a getafix / dogmatix cluster account from the ITS support desk attn: ITS Research Infrastructure
- Read the getafix information for CPU computing (plus 9 Nvidia Tesla V100 GPUs).
- Read the dogmatix information for GPU computing (Nvidia Teslas and Intel Xeon Phi co-processors).
- Read the SLURM Resource Manager information (and some additional notes on converting SGE to Slurm, or Torque to SGE to Slurm); a minimal example script is sketched just after this list.
- getafix system usage/stats are viewable via http://faculty-cluster.hpc.net.uq.edu.au/ganglia/
- dogmatix system usage/stats are viewable via http://faculty-cluster.soe.uq.edu.au/ganglia/
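As a quick first taste of Slurm (full details are on the Slurm Queue page), a minimal batch script looks roughly like the sketch below. The executable my_program is a placeholder, and any partition/queue names and sensible resource limits for getafix / dogmatix should be taken from the Slurm Queue page, not from here:

    #!/bin/bash
    #SBATCH --job-name=test          # name shown in squeue output
    #SBATCH --ntasks=1               # a single task
    #SBATCH --cpus-per-task=1        # one CPU core
    #SBATCH --mem=2G                 # memory for the job
    #SBATCH --time=0-01:00           # walltime as days-hours:minutes

    ./my_program                     # placeholder executable

Save it as, say, myjob.sh, submit it with sbatch myjob.sh, and monitor it with squeue -u $USER.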
We also have some brief howto guides here (minimal example job scripts for several of these topics are sketched after this list):
- why HPC? --- A brief intro guide
- UQ Resources --- what exists at UQ
- getafix --- Main SMP Cluster with nodes of up to 512GB RAM, running Rocks Cluster OS 7
- dogmatix --- Older SMP Cluster, running Rocks Cluster OS 6
- asterix --- (now part of dogmatix) 2011 UQ MEI/SMP purchased GPU Cluster
- ghost --- (now part of dogmatix) 2010 SMP GPU Cluster (Baumgardt group)
- Contacts --- Who to ask for help
- Latest systems overview: uq-smp-computing-workshop-2019-sem1-v2.pdf from the o-week 2019 sem 1 workshop
- data --- introduction to data options in SMP
- Slurm Queue --- introduction to the Slurm queuing system
- OpenMP/MKL --- parallelisation via OpenMP / multi-threaded processing
- Nvidia GPU --- parallelisation via CUDA on our more than 80 GPUs
- MPI --- parallelisation via the MPI protocol
- Intel Phi --- parallelisation via 61-core Xeon Phi co-processors
- matlab --- how to run single/multi-core calculations
- mathematica --- how to run single/multi-core calculations
- FAQ --- other Frequently Asked Questions (eg. XMDS)
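For the OpenMP/MKL topic, the key idea is to request several cores on one node and tell the OpenMP and MKL runtimes how many threads to spawn. A minimal sketch, assuming a hypothetical multi-threaded executable my_omp_program:

    #!/bin/bash
    #SBATCH --job-name=omp-test
    #SBATCH --ntasks=1               # one task ...
    #SBATCH --cpus-per-task=8        # ... with 8 cores on a single node
    #SBATCH --mem=8G
    #SBATCH --time=0-04:00

    # match the thread counts to the cores Slurm actually allocated
    export OMP_NUM_THREADS=$SLURM_CPUS_PER_TASK
    export MKL_NUM_THREADS=$SLURM_CPUS_PER_TASK
    ./my_omp_program                 # placeholder executable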
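For the Nvidia GPU topic, a job additionally requests a GPU through Slurm's generic-resource (gres) mechanism. A minimal sketch; the module name cuda and the executable my_cuda_program are placeholders, and the Nvidia GPU page lists the modules actually installed:

    #!/bin/bash
    #SBATCH --job-name=gpu-test
    #SBATCH --ntasks=1
    #SBATCH --gres=gpu:1             # request one GPU
    #SBATCH --mem=16G
    #SBATCH --time=0-04:00

    module load cuda                 # placeholder module name
    ./my_cuda_program                # placeholder executable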
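For the MPI topic, a job requests multiple tasks (one per MPI rank) and launches them with srun or mpirun. A minimal sketch; the module name openmpi and the executable my_mpi_program are placeholders, and the MPI page covers the intel/gnu compiler and MPI combinations available here:

    #!/bin/bash
    #SBATCH --job-name=mpi-test
    #SBATCH --ntasks=16              # 16 MPI ranks
    #SBATCH --mem-per-cpu=2G         # memory per rank
    #SBATCH --time=0-04:00

    module load openmpi              # placeholder module name
    srun ./my_mpi_program            # or: mpirun -np $SLURM_NTASKS ./my_mpi_program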
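For the matlab topic, the essential trick is running MATLAB without its GUI from inside a batch script. A minimal sketch, assuming a user script mycalc.m and a placeholder module name:

    #!/bin/bash
    #SBATCH --job-name=matlab-test
    #SBATCH --ntasks=1
    #SBATCH --cpus-per-task=4        # cores for MATLAB's built-in multithreading
    #SBATCH --mem=8G
    #SBATCH --time=0-02:00

    module load matlab               # placeholder module name
    matlab -nodisplay -nosplash -r "mycalc; exit"   # run mycalc.m headless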
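For the mathematica topic, the same headless pattern applies using the command-line kernel. A minimal sketch, assuming a Wolfram Language script mycalc.m and a placeholder module name:

    #!/bin/bash
    #SBATCH --job-name=mma-test
    #SBATCH --ntasks=1
    #SBATCH --cpus-per-task=1
    #SBATCH --mem=4G
    #SBATCH --time=0-02:00

    module load mathematica          # placeholder module name
    math -script mycalc.m            # run the script with the text-mode kernel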
Acknowledgments
Please add the following to any publications:
Numerical simulations were performed on The University of Queensland's School of Mathematics and Physics Core Computing Facility "getafix" (optionally: with thanks to Dr. L. Elliott and I. Mortimer for computing support).
CCF@SMP Webpages Changelog:
- 17/12/19 - Erratum on the MPI page fixing an mpirun command option (Michael Bromley)
- 17/9/19 - Additions to MPI page for intel/gnu combos. (Michael Bromley)
- 13/9/19 - Rewrite of MPI page. Additions to Slurm Queue. Deleted old .sge scripts. (Michael Bromley)
- 28/8/19 - Added acknowledgment. Some additions to Slurm Queue for RDP connections. (Michael Bromley)
- 13/3/19 - Added info for the Nvidia Tesla P100s and Tensorflow/Keras documentation at gpu (Isaac Lenton)
- 13/3/19 - Added uq-smp-computing-workshop-2019-sem1-v2.pdf o-week 2019 sem 1 workshop talk (Michael Bromley)
- 13/3/19 - Removed mention of smp-comp0x systems decommissioned 2018-2019: smp-comp01, smp-comp02, ... smp-comp06 (Michael Bromley).
- 20/9/18 - Added mathematica information for running multiple parallel calcs (Michael Bromley/Tom Stace/Ian Mortimer)
- 19/7/18 - Added uq-smp-computing-workshop-2018-sem2-v1.pdf o-week 2018 sem 2 workshop talk (Michael Bromley)
- 18/7/18 - Added data storage information (Michael Bromley)
- 18/7/18 - Added initial getafix information (Michael Bromley)
- 16/5/17 - Added dogmatix and slurm information (Ian Mortimer)
- 15/1/17 - Updated slurm information (Isaac Lenton)
- 11/3/15 - Added MPI howto (Michael Bromley)
- 7/3/15 - Created the obelix webpages (Michael Bromley)
- 2013 - A presentation on the obelix II upgrade was given: obelix-upgrade-presentation.pdf (Michael Bromley)
- 2011 - asterix GPU system documentation on SMP intranet was at http://www.smp.uq.edu.au/node/1602 (Vivien Challis)
These pages were last updated 16th December 2019. [Contacts/help]