Host name: nano.ncsa.illinois.edu
8x SuperMicro SYS-4028GR-TR nodes, each with:
- 128 GB DDR4 (8x 16 GB Micron 2133 MHz 36ASF2G72PZ-2G1A2)
- 8 PCI-E 3.0 ports, switched
- Mellanox MT27500 Family [ConnectX-3] QDR InfiniBand
- 1x 256 GB Samsung SSD 850
NFS-mounted 30 TB /home (2x 6-drive RAID-Z2 with 4 TB drives)
GlusterFS with 2-node fault tolerance, 45 TB usable
- CentOS 7
- CUDA 9.2/10.0
- PGI 16.10
- Intel ICC 16
- gcc 4.8
- gcc 5.3 via 'scl enable devtoolset-4 bash'
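A typical session switching to the newer compiler might look like the following (a sketch; the exact version strings printed will vary with the installed packages):

```shell
gcc --version                 # system default toolchain: gcc 4.8
scl enable devtoolset-4 bash  # start a subshell with devtoolset-4 on PATH
gcc --version                 # inside the subshell: gcc 5.3 from devtoolset-4
exit                          # leave the subshell to return to gcc 4.8
```

Note that `scl enable` spawns a new shell, so the newer gcc is only active in that subshell (and in batch scripts that invoke `scl enable` themselves).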
To request access, please fill out this form. (Use the link on the confirmation page to sign up for a new account; the same link is also included in the confirmation email.)
Instructions for running Jupyter Notebooks on compute nodes
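The usual pattern is to start the notebook server on a compute node and tunnel its port to your workstation through the head node. A sketch, assuming standard Jupyter flags; the port number and node name `nano7` are illustrative, and the linked instructions are authoritative:

```shell
# On a compute node (e.g., inside an interactive job), start Jupyter headless:
jupyter notebook --no-browser --ip=0.0.0.0 --port=8888

# On your workstation, forward the port through the head node:
ssh -L 8888:nano7:8888 <username>@nano.ncsa.illinois.edu

# Then open http://localhost:8888 in a local browser and paste the
# token printed by the jupyter command above.
```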
- nano (nano.ncsa.illinois.edu) is the head node of the cluster; it should not be used for any computations!
- to connect to the cluster, ssh <username>@nano.ncsa.illinois.edu
- to get access to a particular node for interactive use, use qsub; for example:
- to get one GPU and one CPU core on node 7 for 1 hour of interactive use:
- to get the entire node 1 for 1 hour of exclusive interactive use:
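The two requests above can be sketched as follows, assuming standard Torque/PBS syntax with per-node names of the form nanoN and a `gpus` node resource (the exact resource strings depend on the local scheduler configuration):

```shell
# One GPU and one CPU core on node 7, interactive, 1 hour walltime:
qsub -I -l nodes=nano7:ppn=1:gpus=1,walltime=01:00:00

# All of node 1, exclusive (-n), interactive, 1 hour walltime:
qsub -I -n -l nodes=nano1,walltime=01:00:00
```

The `-I` flag makes the job interactive (you get a shell on the allocated node once the job starts), and `-n` asks the scheduler for node-exclusive access.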
- better yet, do not allocate nodes for interactive use; instead, submit batch jobs (see, for example, the Job Scripts section at https://kb.iu.edu/d/avmy for details). This is a much better way to share computing resources.
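A minimal batch script, assuming Torque #PBS directives as in the guide linked above; the job name, resource values, and executable name are illustrative placeholders:

```shell
#!/bin/bash
#PBS -N myjob                                   # job name (placeholder)
#PBS -l nodes=1:ppn=1:gpus=1,walltime=04:00:00  # 1 core, 1 GPU, 4 hours
#PBS -j oe                                      # merge stdout and stderr

cd "$PBS_O_WORKDIR"   # run from the directory the job was submitted from
./my_program          # replace with your actual executable
```

Save it as, say, myjob.pbs and submit it with `qsub myjob.pbs`; the scheduler will queue it and run it when resources free up.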
- interactive jobs are limited to a maximum walltime of 12 hours per job
- batch jobs are limited to 96 hours
- submit a request to staff for longer batch jobs (up to 240 hours)
- to see what’s running on the cluster, just run qstat
- this is a shared resource; please keep in mind that other users need it as well, and do not claim more of the system than you really need.
- home directory is cross-mounted and accessible from all nodes
- Current System Status: https://nano.ncsa.illinois.edu:3000/d/3QVrDIFmz/nano-status