Storage Services in CHPC's Protected Environment (PE)

In the PE, CHPC currently offers four types of storage: home directories, project space, scratch file systems, and a new archive storage system. All storage types except the archive storage system are accessible from all CHPC PE resources. Mammoth (CHPC PE) file services are encrypted. Data on the new archive storage space must be moved to one of the other spaces in order to be accessible. See the Data Transfer Services page for information on moving data to and from the CHPC PE storage.

***Remember that you should always keep an additional copy, or possibly multiple copies, of any crucial/critical data on independent storage systems. While storage systems built with data resiliency mechanisms (such as the RAID and erasure coding mentioned in the offerings listed below, or other similar technologies) allow for multiple component failures, they do not offer any protection against large-scale hardware failures, software failures leading to corruption, or accidental deletion or overwriting of data. Please take the necessary steps to protect your data to the level you deem necessary.***

Home Directories

CHPC provides all PE users with a default 50 GB home directory space on the Mammoth storage.

The size of a user's home directory space is enforced with a quota. There is a soft quota of 50 GB and a hard quota of 75 GB. Once a user's directory exceeds the soft quota, they have seven days to clean up and return below the soft quota amount. After seven days, they will no longer be able to write to their home directory until they are back under the soft quota. If a user exceeds the hard quota, they immediately lose the ability to write to their home directory until they are no longer over this quota.
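As a rough illustration of how these limits interact (a minimal Python sketch of the policy described above, not CHPC's actual enforcement code):

```python
SOFT_QUOTA_GB = 50   # exceeding this starts a 7-day grace period
HARD_QUOTA_GB = 75   # exceeding this blocks writes immediately

def can_write(usage_gb: float, days_over_soft: int) -> bool:
    """Return True if the user can still write to their home directory."""
    if usage_gb >= HARD_QUOTA_GB:
        return False                 # over the hard quota: blocked at once
    if usage_gb > SOFT_QUOTA_GB and days_over_soft > 7:
        return False                 # grace period has expired
    return True

print(can_write(48, 0))    # True  - under the soft quota
print(can_write(60, 3))    # True  - over soft quota, within the grace period
print(can_write(60, 10))   # False - grace period expired
print(can_write(80, 0))    # False - over the hard quota
```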

A more detailed description of the configuration of this storage offering (which is similar to the group space in the general environment) is available.

This space is backed up; for details on the backup schedule, see 3.1 File Storage Policies.

CHPC does not offer larger home directories in the PE. Instead, users should make use of the project space to store data.

Project Level Storage File System

Also on the Mammoth file system, CHPC provides each PE project with its own 250 GB project space at no cost. Due to the NIH Shared Instrument Grant (NIH SIG) that allowed for the new PE, PIs with NIH-supported projects can, at no cost, request up to 5 TB/project with justification of need. Please note that this increased allocation of storage for NIH-supported projects applies only to the storage purchased as part of the NIH SIG, and only through the warranty lifetime of the hardware.

Update August 2021: As we approach the end of the NIH SIG and have purchased additional group space, moving forward NIH projects will no longer be able to request up to 5 TB of project space at no cost. All projects will be provided with 250 GB of project space at no cost.

Any PI can purchase additional project space beyond these levels at $150/TB (price based on hardware cost) for the lifetime of the hardware, which is purchased with a 5-year warranty. CHPC will purchase additional project space as it is needed. Please contact us by emailing helpdesk@chpc.utah.edu to discuss the storage needs of your PE project.

Access to this space is controlled by extended access control lists (ACLs) such that only users who are part of the project are allowed access to the space. For IRB-governed projects, the users given access must be listed on the IRB. For non-IRB-governed projects, the PI of the project will be consulted before CHPC gives any user access to the space.
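For illustration, one hypothetical way to inspect the ACL on a project directory from a PE login node (a sketch assuming the common Linux getfacl utility; the path below is made up, and the PE may expose its ACLs through a different tool such as nfs4_getfacl):

```python
import subprocess

# Hypothetical path; substitute your own project's directory.
project_dir = "/uufs/chpc.utah.edu/common/pe/proj-group1"

# getfacl prints the ACL entries; only the users and groups listed
# there (i.e., project members) are granted access to the space.
result = subprocess.run(
    ["getfacl", project_dir],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```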

Effective in 2021: Due to the growth in the amount of PE group space storage, the backup schedule of the PE project spaces has been changed from the original weekly schedule. Project spaces less than 5 TB in size have a full backup completed once a month, with weekly incremental backups. Spaces 5 TB and larger have a full backup completed once a calendar quarter, with weekly incremental backups. Once a new full backup is completed, the previous period's backup is deleted. For additional details on the current backup policy of the PE project space, see 3.1 File Storage Policies.

Effective 1 July 2022 (announced 10 February 2022): Due to the end of the NIH SIG funding window (April 2022) and the growth of the project space, CHPC will no longer be able to provide backup of the project spaces free of charge. If you want to continue to have your project space backed up, you will need to purchase the necessary amount of elm (the PE archive space, described below) to support the backup. The amount of elm space needed is twice the size of the group space. Currently, this takes the cost of the group space from $150/TB to $450/TB. If this is of interest, please contact us at helpdesk@chpc.utah.edu to discuss and implement the backup plan. For details on the current backup policy of the PE project space, see 3.1 File Storage Policies.
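To make the pricing arithmetic concrete (a sketch using the figures quoted above: $150/TB for group space, $150/TB for elm, and an elm backup footprint of twice the group space):

```python
GROUP_PRICE_PER_TB = 150   # project (group) space, 5-year hardware lifetime
ELM_PRICE_PER_TB = 150     # PE archive (elm) space
BACKUP_FACTOR = 2          # elm capacity needed = 2 x the group space size

def cost_per_tb(with_backup: bool) -> int:
    """Dollar cost per TB of project space, with or without backup."""
    if with_backup:
        # $150 (group) + 2 x $150 (elm for backup) = $450 per TB
        return GROUP_PRICE_PER_TB + BACKUP_FACTOR * ELM_PRICE_PER_TB
    return GROUP_PRICE_PER_TB

print(cost_per_tb(False))      # 150
print(cost_per_tb(True))       # 450
print(10 * cost_per_tb(True))  # 4500 - e.g., a 10 TB backed-up space
```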

Scratch File Systems

The PE scratch file system is /scratch/general/pe-nfs1, with a capacity of 280 TB. THIS FILE SYSTEM IS NOT BACKED UP. This space is provided for users to store intermediate files required during the duration of a job on the PE HPC cluster redwood. On this scratch file system, files that have not been accessed for 60 days are automatically scrubbed. There is no charge for use of the scratch space.
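Because scrubbing is based on access time, it can be worth checking which of your scratch files are approaching the 60-day cutoff before a job depends on them. A minimal sketch (the threshold mirrors the policy above; in practice you would point this at your own subdirectory):

```python
import os
import time

SCRUB_DAYS = 60
cutoff = time.time() - SCRUB_DAYS * 24 * 3600

scratch_dir = "/scratch/general/pe-nfs1"  # narrow to your own subdirectory

# Walk the tree and flag files whose last access time predates the
# cutoff; these are candidates for the automatic scrub.
for root, _dirs, files in os.walk(scratch_dir):
    for name in files:
        path = os.path.join(root, name)
        if os.stat(path).st_atime < cutoff:
            print("at risk of scrub:", path)
```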

Archive Storage (Elm)

The archive storage is the same Ceph solution discussed on the general storage page. With the erasure coding configuration being used, the price is $150/TB of usable capacity for the 5-year lifetime of the hardware. As we currently do with the project space, we will operate this space in a condominium model, reselling it in TB chunks. This space is a standalone entity and is not mounted on other CHPC resources.

Backup Policy

The backup policies of each type of storage have been described above. 

Additional Information

For more information on CHPC data policies, visit: File Storage Policies

Last Updated: 9/8/23