Storage Space on the Cluster
Once you have your account set up on the ORC clusters, you will have access to the following storage options: /home, /scratch, and /projects.

HOME

/home/UserID

Properties of /home:
- Read/write on the login nodes
- On ARGO, /home is mounted read-only on the compute nodes - jobs cannot write to your home directory.
- On HOPPER, /home is mounted read-write on all nodes - jobs can write to your home directory.
- Limited to 50 GB per user. You will get an email warning to clear space once you approach this limit.
- Backed up.
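Since /home is capped at 50 GB, it helps to check your usage before the warning email arrives. A minimal sketch using only standard `du` (no cluster-specific quota command is assumed here):

```shell
# Show total usage of your home directory ($HOME resolves to /home/UserID).
du -sh "$HOME"

# Show the largest top-level items, biggest first, to decide what to clean up.
du -sh "$HOME"/* 2>/dev/null | sort -rh | head -n 10
```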
SCRATCH
/scratch/UserID
Properties of /scratch:
- Read/write on every node - jobs can write output here
- Scratch directories have no space limit
- Temporary: data in /scratch is purged 90 days from the date of creation, so make sure to move your files to a safe place before the purge cycle ends.
- /scratch is not backed up.
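Because of the 90-day purge, it is worth periodically copying older files out of /scratch. A sketch under stated assumptions: `SCRATCH` and `DEST` below are placeholder paths to adjust for your account, and `find`'s `-mtime` filter uses modification time, which only approximates the creation-date-based purge:

```shell
# Placeholder paths: adjust for your own account and destination.
SCRATCH="${SCRATCH:-/scratch/$USER}"
DEST="${DEST:-$HOME/rescued-from-scratch}"

mkdir -p "$DEST"
if [ -d "$SCRATCH" ]; then
    # List files whose modification time is older than 80 days,
    # i.e. files approaching the 90-day purge window.
    find "$SCRATCH" -type f -mtime +80 -print
    # Copy (not move) them, preserving timestamps, so a failed
    # transfer cannot lose the only copy.
    find "$SCRATCH" -type f -mtime +80 -exec cp -p {} "$DEST"/ \;
fi
```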
PROJECTS
/projects/project-owner
To give new group members access to the /projects space, the PI should send an email to orchelp@gmu.edu asking that they be added to the directory.

Properties of /projects:
- On ARGO, read-only on the compute nodes.
- On HOPPER, read-write on all the nodes.
- Not backed up.
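To see at a glance how much you are storing in each of the three free areas, a small loop over the paths works. Here `/projects/project-owner` is a placeholder for your group's actual directory, and the existence check lets the same snippet run even where a path is absent:

```shell
# Report usage for each storage area; skip paths that do not exist
# (e.g. when run on a machine without /scratch mounted).
for d in "$HOME" "/scratch/$USER" "/projects/project-owner"; do
    if [ -d "$d" ]; then
        du -sh "$d"
    else
        echo "skipping $d (not present here)"
    fi
done
```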
All of these are free storage resources on the ORC clusters. If you need more storage space on the cluster, please send an email to orchelp@gmu.edu to discuss the available paid storage options.