...

2.1 Hardware

||Node Type||Compute Nodes||Node Names||Cores per Node||Memory per Node (GB)||Memory per Core (GB)||Total Cores||Total Memory (GB)||Processor Type||
|Small Memory Nodes|4|n001-n004|12|12|1|48|48|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Medium Memory Nodes|9|n005-n012, n019|12|24|2|108|216|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Large Memory Nodes|6|n013-n018|12|48|4|72|288|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Extra Large Memory Nodes with GPU (see the table below for more details about the GPUs)|4|n020-n023|12|96|8|48|384|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz, NVIDIA Tesla C2050|
|Extra Large Memory Nodes (no GPU)|8|n031-n038|12|96|8|96|768|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Special Nodes (no GPU)|4|n039-n042|16|32|2|64|128|Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz|
|Special Large Memory Nodes (no GPU)|4|n044-n047|16|256|16|64|1024|Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz|
|Special Large Memory Nodes (no GPU)|12|aspen01-aspen12|16|128|8|192|1536|Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz|

Please note that each of the Extra Large nodes with GPU (n020, n021, n022 and n023) has two NVIDIA Tesla C2050 GPU cards.
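When submitting work through the batch system (see Section 7), the per-node figures above bound what a single job can request from one node. A minimal sketch, assuming PBS-style select statements (the script name is illustrative, and the exact select/nodes syntax depends on the PBS version in use):

No Format
# One whole Large Memory node (12 cores, 48 GB), i.e. one of n013-n018:
qsub -l select=1:ncpus=12:mem=48gb myjob.pbs

# A single core with 1 GB of memory, which even a Small Memory node can satisfy:
qsub -l select=1:ncpus=1:mem=1gb myjob.pbs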

||Node Type||Programming Model||Compute Nodes||Node Names||CUDA Cards per Node||CUDA Cores per Node||Total CUDA Cores||Memory per Node (GB)||Memory per Core (GB)||Total Memory (GB)||Processor Type||
|Extra Large Memory Nodes with GPU|GPU|4|n020-n023|2|2 x 448 = 896|4 x 896 = 3584|96|8|384|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz, NVIDIA Tesla C2050|
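To use the Tesla C2050 cards, code must be built against the CUDA toolkit (listed in the software section below) and run on n020-n023. A rough sketch, assuming the cuda/4.0 module puts nvcc on the PATH (the source file name is illustrative):

No Format
module load cuda/4.0

# The Tesla C2050 is a Fermi card, compute capability 2.0:
nvcc -arch=sm_20 -o vecadd vecadd.cu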

Special Administrative Nodes (not used for computing purposes)

||Node Type||Node Name||Total Cores||Total Memory (GB)||Memory per Node (GB)||Memory per Core (GB)||Processor Type||
|File Servers|n024, n025|24|96|48|4|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Test Node|testhpc|12|24|24|2|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Login Node|gowonda|12|48|48|4|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|
|Admin Node|n030 (admin)|12|24|24|2|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|

More information about the Intel(R) Xeon(R) CPU X5650 processor is available from Intel's product documentation.

In addition to the above, there is a special Windows HPC node.

||Node Type||Compute Nodes||Node Name||Cores per Node||Memory per Node (GB)||Memory per Core (GB)||Total Cores||Total Memory (GB)||Processor Type||OS||
|Windows 2008 Large Memory Node|1|n029|12|48|4|12|48|Intel(R) Xeon(R) CPU X5650 @ 2.67GHz|Windows Server 2008 R2 with Windows HPC Pack|

Instructions for using the Windows HPC node are given in a separate user guide.

...

The following third-party applications are currently installed or will be installed shortly. The Gowonda HPC Center staff will be happy to work with any user interested in installing additional applications, subject to meeting that application's license requirements.

||Software||Version||Usage||Status||
|AutoDOCK|4.2.3|module load autodock423 autodockvina112|Installed|
|Bioperl|-|-|TBI (to be installed)|
|Blast|-|-|Installed|
|CUDA|4.0|module load cuda/4.0|Installed|
|Gaussian03|-|module load gaussian/g03|Installed|
|Gaussian09|-|module load gaussian/g09|Installed|
|Gromacs|-|-|Installed|
|gromos|1.0.0|module load gromos/1.0.0|Installed|
|MATLAB|2009b, 2011a|module load matlab/2009b or module load matlab/2011a|Installed|
|MrBayes|-|-|TBI (to be installed)|
|NAMD|-|module load NAMD/NAMD28b1|Installed|
|numpy|1.5.1|module load python/2.7.1|Installed|
|PyCogent|-|module load python/2.7.1|Installed|
|qiime|-|-|TBI (to be installed)|
|R|2.13.0|module load R/2.13.0|Installed|
|SciPy|0.9.0|module load python/2.7.1|Installed|
|VASP|-|-|TBI (to be installed)|
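All of the packages above are made available through environment modules, as the Usage column shows. A typical session, using the MATLAB module from the table as the example:

No Format
module avail                 # list all modules installed on the cluster
module load matlab/2011a     # add MATLAB 2011a to the current environment
module list                  # show which modules are currently loaded
module unload matlab/2011a   # remove it again when finished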

The following graphics, I/O, and scientific libraries are also supported.

||Software||Version||Usage||Status||
|ATLAS|3.9.39|module load ATLAS/3.9.39|Installed|
|FFTW|3.2.2, 3.3a|module load fftw/3.3-alpha-intel|Installed|
|GSL|1.09, 1.15|module load gsl/gsl-1.15|Installed|
|LAPACK|3.3.0|-|-|
|NetCDF|3.6.2, 3.6.3, 4.0, 4.1.1, 4.1.2|e.g. module load NetCDF/4.1.2|Installed|
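Library modules are used the same way; loading one typically adds the library's include and lib directories to the compiler's search paths. A minimal sketch of building against GSL, assuming the gsl/gsl-1.15 module exports those paths (otherwise explicit -I and -L flags pointing at the module's install prefix are needed; the source file name is illustrative):

No Format
module load gsl/gsl-1.15

# Link against GSL and its bundled CBLAS:
gcc -o demo demo.c -lgsl -lgslcblas -lm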

3 Support

3.1 Hours of Operation

...

Users with further questions, or those requiring immediate assistance in using the systems, should submit a ticket here.


Support staff may be contacted via:
  • Griffith Library and IT Help: 3735 5555 or x55555
  • Email support: submit the support form
  • Logging a case on the service desk (category: eResearch services > HPC)

...

To log in to the cluster, ssh to:

No Format
gc-prd-hpclogin1.rcs.griffith.edu.au

You will need to be connected to the Griffith network (either on campus or through the VPN from home).

...

ssh on Linux and Mac platforms

ssh -Y gc-prd-hpclogin1.rcs.griffith.edu.au

Once you are on the system, have a look around. Your home directory is stored in:
/exports/home/<SNumber>
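Files can be copied into that home directory from another machine on the Griffith network using scp; s1234567 below is a stand-in for an actual SNumber:

No Format
scp mydata.tar.gz s1234567@gc-prd-hpclogin1.rcs.griffith.edu.au:/exports/home/s1234567/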

...

No Format
 Warning
Do not run jobs on the login node "gc-prd-hpclogin1.rcs.griffith.edu.au". Please use it for compilation and small debugging runs only.
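For the small debugging runs mentioned in the warning, an interactive batch job keeps the work off the login node entirely. Assuming interactive jobs are enabled in the PBS configuration (the resource amounts shown are placeholders):

No Format
# Request an interactive shell on a compute node for one hour:
qsub -I -l select=1:ncpus=1:mem=2gb -l walltime=01:00:00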

...

7.1 Submit Jobs


qsub <pbscriptFile> 


A simple PBS script file is as follows:
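(What follows is a minimal sketch; the job name, resource amounts, walltime, and executable name are placeholders to adapt, and the exact select syntax depends on the PBS version in use.)

No Format
#!/bin/bash
#PBS -N myjob                       # job name shown by qstat
#PBS -l select=1:ncpus=4:mem=8gb    # one chunk: 4 cores, 8 GB of memory
#PBS -l walltime=02:00:00           # maximum run time
#PBS -j oe                          # merge stdout and stderr into one file

# Run from the directory the job was submitted from
cd $PBS_O_WORKDIR

# Replace with the actual program to run
./my_program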

...