...
2.1 Hardware
Node Type | Total Number of Cores | Total Amount of Memory (GB) | Compute Nodes | Cores per Node | Memory per Node (GB) | Memory per Core (GB) | Processor Type
---|---|---|---|---|---|---|---
Small Memory Nodes | 48 | 48 | 4 | 12 | 12 | 1 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Medium Memory Nodes | 108 | 216 | 9 | 12 | 24 | 2 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Large Memory Nodes | 72 | 288 | 6 | 12 | 48 | 4 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Extra Large Memory Nodes with GPU (see table below for GPU details) | 48 | 384 | 4 | 12 | 96 | 8 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Extra Large Memory Nodes (no GPU) | 96 | 768 | 8 | 12 | 96 | 8 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Special Nodes (no GPU) | 64 | 128 | 16 | 16 | 32 | 2 | Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz
Special Large Memory Nodes (no GPU) | 64 | 1024 | 8 | 16 | 256 | 16 | Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz
Special Large Memory Nodes (no GPU) | 192 | 1536 | 24 | 16 | 128 | 8 | Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz
Please note that each of the Extra Large Memory nodes with GPU (n020, n021, n022 and n023) has two NVIDIA Tesla C2050 GPU cards.
Node Type | Programming Model | Total Number of CUDA Cores | Total Amount of Memory (GB) | Compute Nodes | CUDA Cores per Node | CUDA Cards per Node | Memory per Node (GB) | Memory per Core (GB) | Processor Type
---|---|---|---|---|---|---|---|---|---
Extra Large Memory Nodes with GPU | GPU | 4 x 2 x 448 | 384 | 4 | 2 x 448 | 2 | 96 | 8 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Special Administrative Nodes (Not used for computing purposes)
Node Type | Node Name | Total Number of Cores | Total Amount of Memory (GB) | Memory per Node (GB) | Memory per Core (GB) | Processor Type
---|---|---|---|---|---|---
File Servers | n024, n025 | 24 | 96 | 48 | 4 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Test Node | testhpc | 12 | 24 | 24 | 2 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Login Node | gowonda | 12 | 48 | 48 | 4 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
Admin Node | n030 (admin) | 12 | 24 | 24 | 2 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz
More information about the Intel(R) Xeon(R) CPU X5650 processor is available on Intel's website.
In addition to the above, there is a special Windows HPC node.
Node Type | Total Number of Cores | Total Amount of Memory (GB) | Compute Nodes | Cores per Node | Memory per Node (GB) | Memory per Core (GB) | Processor Type | OS
---|---|---|---|---|---|---|---|---
Windows 2008 Large Memory Node | 12 | 48 | 1 | 12 | 48 | 4 | Intel(R) Xeon(R) CPU X5650 @ 2.67GHz | Windows 2008 R2 with Windows HPC Pack
Instructions for using the Windows HPC node are given in a separate user guide.
...
The following third party applications are currently installed or will be installed shortly. The Gowonda HPC Center staff will be happy to work with any user interested in installing additional applications, subject to meeting that application's license requirements.
Software | Version | Usage | Status
---|---|---|---
AutoDOCK | 4.2.3 | module load autodock423 autodockvina112 | Installed
Bioperl | | | TBI (To be installed)
Blast | | | Installed
CUDA | 4.0 | | Installed
Gaussian03 | | | Installed
Gaussian09 | | | Installed
Gromacs | | | Installed
gromos | 1.0.0 | | Installed
MATLAB | 2009b, 2011a | | Installed
MrBayes | | | TBI (To be installed)
NAMD | | module load NAMD/NAMD28b1 | Installed
numpy | 1.5.1 | | Installed
PyCogent | - | | Installed
qiime | | | TBI (To be installed)
R | | | Installed
SciPy | 0.9.0 | | Installed
VASP | - | - | TBI (To be installed)
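Installed applications are made available through environment modules, as the Usage column above indicates. A typical session might look like the following sketch (the module names are taken from the table above; check `module avail` for the authoritative list):

```shell
# List the modules available on the cluster
module avail

# Load AutoDOCK and NAMD using the names given in the table above
module load autodock423
module load NAMD/NAMD28b1

# Confirm which modules are currently loaded
module list
```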
The following graphics, IO, and scientific libraries are also supported.
Software | Version | Usage | Status
---|---|---|---
Atlas | 3.9.39 | | Installed
FFTW | 3.2.2, 3.3a | | Installed
GSL | 1.09, 1.15 | | Installed
LAPACK | 3.3.0 | - | -
NETCDF | 3.6.2, 3.6.3, 4.0, 4.1.1, 4.1.2 | | Installed
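To compile against one of these libraries, the corresponding module is typically loaded first and the library then named on the link line. A hedged sketch for FFTW (the module name and source file here are illustrative; check `module avail` for the exact name on this cluster):

```shell
# Load the FFTW library module (name is illustrative)
module load fftw

# Compile and link against double-precision FFTW and the math library
gcc mycode.c -lfftw3 -lm -o mycode
```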
3 Support
3.1 Hours of Operation
...
Users with further questions, or requiring immediate assistance in using the systems, should submit a support ticket.
Support staff may be contacted via:
- Griffith Library and IT Help: 3735 5555 (or x55555 internally)
- Email support: submit the support form
- Service desk: log a case under the category "eResearch services.HPC"
...
To log in to the cluster, ssh to:

```
gc-prd-hpclogin1.rcs.griffith.edu.au
```
You will need to be connected to the Griffith network (either on campus or via VPN from home).
...
ssh on Linux and Mac platforms:

ssh -Y gc-prd-hpclogin1.rcs.griffith.edu.au
Once you are on the system, have a look around. Your home directory is stored in:
/exports/home/<SNumber>
...
```
Warning: Do not run jobs on the login node "gc-prd-hpclogin1.rcs.griffith.edu.au". Please use it for compilation and small debugging runs only.
```
...
7.1 Submit Jobs
qsub <pbscriptFile>
A simple PBS script file is as follows:
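A minimal sketch of such a script (the job name, resource request, and walltime are illustrative and should be adapted to the actual job):

```shell
#!/bin/bash
#PBS -N example_job            # job name (illustrative)
#PBS -l nodes=1:ppn=1          # request one core on one node
#PBS -l walltime=00:10:00      # ten-minute wall-clock limit
#PBS -j oe                     # merge stdout and stderr into one file

# Change to the directory the job was submitted from
cd $PBS_O_WORKDIR

# Run the actual work
echo "Job running on $(hostname)"
```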
...