
Storage Space on the Cluster

Once you have your account set up on the ORC clusters, you will have access to the following storage options.

Home

/home/UserID
The /home directory is where you land when you first start a shell session on the cluster. From /home, you can navigate to the other spaces, including /scratch and /projects.

Properties of /home:

  • Read/write on the login/head and compute nodes
  • Limited to 60 GB per user. You will get an email warning to clear space once you approach this limit.
  • Backed up.

SCRATCH

/scratch/UserID

Properties of /scratch:

  • Read/write on every node - jobs can write output here
  • Scratch directories have no space limit
  • Temporary: data in /scratch is purged 90 days after the date of creation, so move your files to a safe place before the purge cycle ends (see the example after this list).
  • /scratch is not backed up.
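
For example, you could copy results out of /scratch into a /projects directory before the purge. This is a minimal sketch; /projects/my-project is a placeholder for your group's actual projects directory:

# Copy results from /scratch to a projects directory (placeholder path) before the 90-day purge
rsync -av /scratch/$USER/results/ /projects/my-project/results/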

PROJECTS

/projects/project-owner
This is additional persistent free storage for projects whose storage requirements exceed the 60 GB available in the /home directory. Faculty can request up to 1 TB of disk space if needed. Students, postdocs, and collaborators should request access to existing projects spaces through their PI/advisor/supervisor. To request a projects directory, the faculty member (PI) should send an email to orchelp@gmu.edu asking for the projects directory to be created, along with a list of user IDs that should be granted access to it. To add students or other group members to an existing /projects space, the PI should email orchelp@gmu.edu asking that the new members be added to the directory.

The /projects space is:

  • On ARGO, read-write on all nodes.
  • On HOPPER, read-write on all nodes.
  • Not backed up.

All of these are free storage resources on the ORC clusters. If you need more storage space on the cluster, please send an email to orchelp@gmu.edu to discuss available paid storage options, such as the /groups space described below.

GROUPS

/groups/group-name
This is storage on our MEMORI system that can be purchased by faculty members. They will need to provide an ORG code to charge the cost to and sign an SLA.

The current rate is $50/TB/year through 06/30/2024. The new rate starting FY24 (07/01/2024 onward) is $60/TB/year and is valid for 5 years.

  • Storage is provided in 1 TB increments and cannot be purchased in smaller amounts. The cost cannot be prorated regardless of the time of year the storage is purchased.
  • The rate of $60/TB/year is a yearly rate and the storage cost will be charged every year. Please budget for each year.
  • Facilities & Administrative (F&A) rates will be applied to the storage costs for all proposal storage awards. Please include the F&A cost in the budget.
  • F&A charges are not levied if “Indirect” funds are used to pay the storage cost.
  • The storage space is to be used to store and share research data.
  • ORC does not back up the data. We use data protection schemes such as replication or erasure coding to prevent data loss due to hardware failure, but we cannot recover data that was intentionally or unintentionally removed or changed.

Monitoring Storage Usage

When the storage space is exceeded, you will likely receive a 'Disk quota exceeded' error and will not be able to write to the cluster. To avoid exceeding your /projects or /home quota, you can track your usage. You can quickly get disk usage for a directory $DIR using the following options, in order of decreasing speed:

gdu (fast)

gdu --si -s $DIR

ncdu

ncdu --si $DIR

du (slow)

du --si -s $DIR
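
To find which subdirectories are taking up the most space (and are therefore good candidates for cleanup), here is a quick sketch using standard GNU coreutils, checking your home directory as an example:

# Show the size of each top-level item in your home directory, largest last
du --si -d 1 /home/$USER | sort -h | tail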

You will be unable to save anything to your /projects or /home directory until you make space by:

  • removing unnecessary files from /projects or /home
  • compressing files in /projects or /home (see the example after this list)
  • requesting and purchasing /groups storage at a cost of $50/TB/year ($60/TB/year starting FY24) by emailing orchelp@gmu.edu
  • moving files to your /scratch directory. Please note that /scratch is not backed up and it is subject to a 90-day purge policy.
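
As a sketch of the compression option, you could bundle an infrequently used directory into a single compressed archive and then remove the original. The directory name below is a placeholder:

# Create a compressed archive of an old results directory (placeholder name), then remove the original
tar -czf old_results.tar.gz old_results/ && rm -r old_results/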