Data Transfer Services

Transferring data in and out of CHPC resources is a critical part of many science workflows. A workflow may involve moving a few small text files between collaborators or sites, or transferring multiple terabytes or petabytes of data. The CHPC offers a number of options for moving data to and from CHPC storage resources. In some cases the data may not need to be moved at all, as some CHPC file systems can be mounted directly from local resources.

A good resource for information on data transfer considerations is the ESnet Faster Data site. For setting expectations about transfer times, see the page on expected time to transfer data. For understanding how dropped packets affect network protocols, and the corresponding impact on large science transfers, see the page on TCP for long-range data transfers.

  Large transfers are very dependent on the characteristics of the resources on both ends of the transfer. If you need assistance in initiating, monitoring, or troubleshooting large transfers, you can reach out to CHPC via helpdesk@chpc.utah.edu.

On this page, we provide a description and usage information for four common scenarios:

  • Data transfers for small data sets
  • Data transfers for large data sets
  • Direct mounts of CHPC file systems
  • The temporary guest transfer service

 

Data Transfers for Small Data Sets

In some scenarios, a research workflow requires moving a few small input files to a computational cluster and/or a few small output files back to a local desktop for storage, final reporting, and/or analysis. For this case, CHPC suggests four options:

A graphical user interface that makes transferring small data sets between your local computer and your CHPC home directory easy.

WinSCP (Windows), CyberDuck (Macs)

Simple graphical tools for moving data to/from a local machine; you can "drag and drop" files from one system to the other.

Authenticated File Transfers (scp, rsync)

Authenticated file transfer tools like scp and rsync offer the significant advantages of encrypted data transmission and required user authentication. This ensures that data remains confidential during transfer and that only authorized individuals can access or modify files at the destination.
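For example, a typical pair of commands might look like the following (the hostname is illustrative; any CHPC login node or data transfer node will work, and <uNID> is your uNID):

    # Copy a local input file to your CHPC home directory
    scp input.dat <uNID>@notchpeak.chpc.utah.edu:~/

    # Mirror a results directory back to the local machine; rsync re-copies
    # only files that have changed, so repeated runs are inexpensive
    rsync -av <uNID>@notchpeak.chpc.utah.edu:project/results/ ./results/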

Unauthenticated File Transfers (wget, curl)

Unauthenticated file transfer tools like Wget and curl excel at efficiently downloading publicly available data directly from web servers without requiring logins or credentials. This makes them ideal for scripting automated downloads and retrieving resources that are intended for open access.
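For example, a publicly posted data set can be retrieved directly onto a CHPC system with either tool (the URL below is a placeholder):

    # Download a public file over HTTPS with wget
    wget https://example.org/data/sample.tar.gz

    # The same download with curl, following redirects and naming the output file
    curl -L -o sample.tar.gz https://example.org/data/sample.tar.gz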

 

Data Transfers for Large Data Sets

For science workflows that transfer very large files and/or very large data sets between national labs, industry partners, or peer institutions, users require more advanced parallel transfer tools running on tuned endpoint devices such as Data Transfer Nodes (DTNs). CHPC supports a variety of parallel transfer tools that cover the known science workflows at the University of Utah. If a research group requires support for an additional transfer tool, the group may request it through helpdesk@chpc.utah.edu.

Data Transfer Nodes

Campus network traffic usually routes through a firewall, which is sufficient for small transfers. Research computing, however, requires high-bandwidth connections that can overwhelm the firewall and impact general campus use. To address this, a Science DMZ was created: a separate network optimized for performance and security. CHPC provides Data Transfer Nodes (DTNs) on this DMZ for efficient research data transfers.

For more information, visit "What makes a Data Transfer Node?".

DTNs can be used with standard CLI or GUI data transfer tools (e.g., rsync, wget, WinSCP) to move data to and from CHPC resources. These nodes are also set up to leverage all the parallel transfer tools that CHPC supports. All CHPC DTNs are available as Globus endpoints, which allows for concurrent and fault-tolerant transfer of large data sets with many files (see the Globus section below for CHPC endpoints). DTNs can also be used to transfer data to and from the cloud.
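As an illustrative sketch (the remote hostname and paths are placeholders), pulling a large data set through one of the external DTNs listed below might look like:

    # Log in to an external DTN
    ssh <uNID>@dtn05.chpc.utah.edu

    # Pull the data set; --partial lets an interrupted transfer resume where it left off
    rsync -av --partial --progress user@remote.site.edu:/data/run42/ /scratch/general/vast/<uNID>/run42/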

The CHPC also supports additional specialized tools for moving data to/from cloud storage. Some of these tools are specific to a single cloud storage provider (such as s3cmd for Amazon S3), whereas others, such as rclone, work with many different cloud storage providers.
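For example, with rclone the workflow is typically a one-time interactive configuration of a remote followed by copy commands (the remote and bucket names below are placeholders):

    # Interactively define a cloud remote (name it, e.g., "mycloud")
    rclone config

    # Copy a results directory to the configured remote, showing progress
    rclone copy /scratch/general/vast/<uNID>/results mycloud:my-bucket/results --progress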

The CHPC also supports a number of group-owned DTNs. If you need information about an existing group-owned DTN, or are interested in having a DTN dedicated to your group, please contact us through helpdesk@chpc.utah.edu.

Both internal and external DTNs can be used via Slurm on notchpeak (General Environment) and redwood (Protected Environment).

 

General Environment DTNs

General Environment DTNs are divided into internal (intdtn) and external (dtn) groups. Internal DTNs offer 40 Gbps connectivity within the campus firewall, ideal for on-campus transfers. External DTNs provide 100 Gbps connectivity to the Science DMZ, better suited for off-campus transfers and large datasets. 

CHPC General Environment DTNs:

  • intdtn01.chpc.utah.edu
  • intdtn02.chpc.utah.edu
  • intdtn03.chpc.utah.edu
  • intdtn04.chpc.utah.edu
  • dtn05.chpc.utah.edu
  • dtn06.chpc.utah.edu
  • dtn07.chpc.utah.edu
  • dtn08.chpc.utah.edu

Protected Environment DTNs

The CHPC's Protected Environment (PE) DTNs operate at 40 Gbps and require the University of Utah VPN for off-campus access. While offering the same data transfer tools as the General Environment, the PE DTNs have stricter security for remote connections. Tools like screen and tmux allow background transfers to continue even if the SSH connection is lost.
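For example, a long-running transfer can be started inside tmux so that it survives a dropped SSH session (the paths and remote host are placeholders):

    # Start a named tmux session on the PE DTN
    tmux new -s transfer

    # Launch the transfer inside the session
    rsync -av /path/to/data/ user@remote.site.edu:/incoming/

    # Detach with Ctrl-b d; the transfer keeps running. Reattach later with:
    tmux attach -t transfer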

CHPC Protected Environment DTNs:

  • pe-dtn03.chpc.utah.edu
  • pe-dtn04.chpc.utah.edu
  • pe-dtn.chpc.utah.edu (a round-robin alias that distributes connections across the PE DTNs above)

 

See Data Transfer Node Access via Slurm for details.
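As a rough sketch of what a Slurm-submitted transfer might look like (the partition and account names below are placeholders; consult the page above for the actual values to use):

    #!/bin/bash
    #SBATCH --partition=notchpeak-dtn   # placeholder; see the Slurm DTN access page
    #SBATCH --account=my-group          # placeholder; use your group's account
    #SBATCH --time=02:00:00
    #SBATCH --nodes=1

    # The transfer runs on the DTN that Slurm allocates
    rsync -av /scratch/general/vast/$USER/run42/ user@remote.site.edu:/incoming/run42/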

Parallel Transfer Tools

See our Software page for more information on the CHPC installations of the parallel transfer tools. These can be used on the Data Transfer Nodes described in the section above (recommended for any large file transfer), or from other CHPC-maintained resources that have access to the CHPC applications.

 

Direct Mounts of CHPC File Systems

  The CHPC doesn't offer direct mounts of PE file servers outside of the PE. We recommend that users consider Globus, SFTP (FileZilla), SCP (WinSCP), rclone, and the PE Data Transfer Nodes.

For usage situations where you do not need a second copy on a local machine, but only need to access a file, you can have the CHPC file system in which the file is located mounted on your local machine. CHPC allows mounts of home directories as well as group-owned file systems on local machines; however, we do not allow mounts of the general CHPC scratch file systems such as /scratch/general/nfs1 and /scratch/general/vast. If the local machine is off campus, you must connect via the University of Utah VPN. Below is information on how to set up the mounts for home and group spaces on Windows, Mac, and Linux systems. Note that there is also a short training video that covers this topic. In all of the following, you must replace <uNID> with your uNID.

 

Home Directories 

The information needed to mount your home directory space is in the Account Creation Notification email sent to new users. Alternatively, you can get this information on a CHPC resource by issuing the df command (or df | grep <uNID> to show only your home directory; a short example follows the list below). If df returns eth.vast.chpc.utah.edu:/home/XX/, then using the value of XX and your uNID you would map the network drive using the following paths (NOTE: when authenticating, your username is ad\<uNID>):

  • On Windows: \\samba.chpc.utah.edu\XX-home\<uNID>
  • On a Mac:  smb://samba.chpc.utah.edu/XX-home/<uNID>  
  • On Linux: If you have root, you can cifs mount CHPC file spaces with the following command (you will be prompted for your password): mount.cifs //samba.chpc.utah.edu/XX-home/<uNID> /mnt -o domain=AD,user=<uNID>
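For example, the lookup on a CHPC login node might look like this (output abridged and illustrative):

    df -h | grep <uNID>
    # eth.vast.chpc.utah.edu:/home/XX   50T   32T   18T   64%   /uufs/chpc.utah.edu/common/home/<uNID>
    # The "XX" value shown is what you substitute into the samba paths above.
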
Group Directories

When the PI obtains group space, CHPC provides the name of the space along with information on mounting and using it. (NOTE: when authenticating, your username is ad\<uNID>):

  • On Windows: \\samba.chpc.utah.edu\<name-groupnumber>
  • On a Mac: smb://samba.chpc.utah.edu/<name-groupnumber>
  • On Linux: If you have root, you can cifs mount CHPC file spaces with the following command (you will be prompted for your password): mount.cifs //samba.chpc.utah.edu/<name-groupnumber> /mnt -o domain=AD,user=<uNID>

 

Temporary Guest Transfer Service

CHPC provides a mechanism for our users to transfer files to and/or from individuals without CHPC accounts. This service is called guest-transfer.

What is it for?

  • At times, CHPC users need to transfer files to or from individuals that don't have CHPC accounts. These files are often quite large (many gigabytes), and thus unsuitable for other transport mechanisms (email, DVD).
  • These file transfers often need to happen with little or no warning. They may also need to occur outside CHPC's support hours. Thus, the guest-transfer service must function without intervention or assistance from CHPC staff.

What is it not for?

  • The guest transfer service is not for repeated events.
  • The guest transfer service is not for long-term data storage.
  • The guest transfer service is not for restricted (PHI/HIPAA/HITECH/FERPA) or sensitive data.
  • The guest transfer service is not for huge data transfers (it's currently restricted to approximately 5 terabytes).

How do I get a guest account?

  • When you need to use the guest transfer service, visit  https://guest-transfer.chpc.utah.edu/ and fill out the form. This form creates a guest transfer account. You then give the guest account username and password to your colleague. You and your colleague can now share files.

How do I use the service?

  • Once you have created a guest-transfer account and given the credentials to your colleague, both of you can copy files to guest-transfer.chpc.utah.edu with an scp/sftp client (scp, sftp, WinSCP, etc.). Files should be transferred to /scratch on guest-transfer.chpc.utah.edu.
  • The CHPC user can then transfer the files from /scratch on guest-transfer.chpc.utah.edu to other CHPC resources, as sketched below.
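For example (the guest account name is whatever the form generated; the file name is illustrative):

    # Your colleague uploads the file to the guest-transfer scratch space
    scp big_dataset.tar.gz <guest-account>@guest-transfer.chpc.utah.edu:/scratch/

    # You then pull it onto a CHPC resource, e.g., from a DTN
    scp <guest-account>@guest-transfer.chpc.utah.edu:/scratch/big_dataset.tar.gz /scratch/general/vast/<uNID>/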

Things to remember:

  • The process is completely automatic: you fill out the form and it immediately gives you a guest account.
  • Only CHPC users can create accounts.
  • The person who creates the guest account is responsible for all activity of the guest account.
  • This guest account is only usable for the guest transfer service. It provides no access to any other CHPC or University resources.
  • Files are transferred via scp/sftp. Interactive logins are disabled.
  • Files are automatically removed after 90 days (based on last-access time).
  • Files in the guest-transfer service can be read or changed by any other user.
  • Consider using encryption to protect and verify your files (a short sketch follows this list).
  • DO NOT USE THIS SERVICE FOR SENSITIVE OR RESTRICTED DATA!
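For example, one simple way to protect and verify a file before it goes through the guest-transfer service (the file name is illustrative; share the passphrase and checksum out of band, not through the service):

    # Encrypt with a passphrase before uploading
    gpg --symmetric --cipher-algo AES256 big_dataset.tar.gz   # produces big_dataset.tar.gz.gpg

    # Record a checksum so the recipient can verify integrity after download
    sha256sum big_dataset.tar.gz.gpg

    # The recipient decrypts after downloading
    gpg --output big_dataset.tar.gz --decrypt big_dataset.tar.gz.gpg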

 

Last Updated: 5/15/25