Machine learning and artificial intelligence at the Center for High Performance Computing

The Center for High Performance Computing (CHPC) broadly supports computational research at the University of Utah. Our vision is to support the ever-increasing—and increasingly diverse—research computing and data needs of researchers at the university. To this end, we offer computational resources, data storage and transfer services, virtual machines, resources for the analysis of protected health information, and advanced training and user support. We also support hardware and software for machine learning and artificial intelligence applications, which are increasingly relevant to researchers from every discipline and with every level of experience performing computational research. Whether you're studying the structure and function of proteins, analyzing patterns in the formation of sea ice, or examining speech patterns in audio recordings, CHPC is here to help.


Software for machine learning

CHPC provides researchers with a number of common machine learning tools, including TensorFlow, Keras, PyTorch, and scikit-learn, through the "deeplearning" module. This bundle gives researchers a quick, easy way to explore machine learning with no software installation necessary. Researchers who need other software can also install their own Python packages. In addition, we provide a selection of popular large language models and related tools for researchers to use.
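As a sketch of how this workflow might look on a cluster login node (the exact module name and versions may differ; the package name in the last line is a placeholder):

```shell
# Load the deeplearning module, which bundles common ML frameworks
module load deeplearning

# Confirm the bundled frameworks are importable and print their versions
python -c "import tensorflow as tf; print('TensorFlow', tf.__version__)"
python -c "import torch; print('PyTorch', torch.__version__)"
python -c "import sklearn; print('scikit-learn', sklearn.__version__)"

# Need a package the bundle lacks? Install it into your home directory
# ("extra-package" is a placeholder, not a real package name)
python -m pip install --user extra-package
```

Because `pip install --user` places packages under your home directory, self-installed software persists between sessions without administrator privileges.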

Researchers can use the Open OnDemand web interface with the deeplearning module or self-installed packages to quickly gain access to CHPC's high-performance computing systems with the familiar interface of a Jupyter notebook.


GPUs

In addition to tens of thousands of CPU cores, CHPC's high-performance computing clusters include servers with GPUs, which can greatly improve training times and throughput for large machine learning applications. Additional information on GPU resources, including instructions for access, can be found in our GPU documentation.
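CHPC's clusters are scheduled with Slurm, so a GPU job is typically requested through a batch script. The sketch below shows the general shape; the partition, account, and script names are placeholders, and the GPU documentation lists the values valid on each cluster:

```shell
#!/bin/bash
#SBATCH --job-name=train-model
#SBATCH --partition=example-gpu    # placeholder; use a real GPU partition name
#SBATCH --account=example-account  # placeholder; use your CHPC account
#SBATCH --gres=gpu:1               # request one GPU
#SBATCH --time=02:00:00            # wall-clock limit of 2 hours
#SBATCH --mem=32G                  # host memory for the job

module load deeplearning
python train.py                    # placeholder for your training script
```

Submitting the script with `sbatch` queues the job; `--gres=gpu:1` is what actually reserves a GPU, and some partitions also let you request a specific GPU model (for example, `--gres=gpu:model:1`).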


Data storage and transfer

Machine learning and artificial intelligence are data-intensive by nature. Large data sets are right at home at CHPC; we store petabytes of research data and offer storage solutions—including options with backups—to researchers by the terabyte.

There are several options for moving data to and from CHPC resources. Among these are the Data Transfer Nodes (DTNs), which have been optimized to enable low-latency and high-speed data transfers.
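One common way to move data through a DTN is a tool such as rsync over SSH. A minimal sketch follows; the hostname and paths are placeholders, not real CHPC endpoints, so consult the data transfer documentation for actual DTN hostnames:

```shell
# Copy a local data set to CHPC storage through a Data Transfer Node.
# "dtn.example.edu" and the destination path are placeholders.
rsync -av --progress ./my_dataset/ username@dtn.example.edu:/path/to/project/storage/
```

Rerunning the same command transfers only files that have changed since the last run, which makes rsync convenient for resuming interrupted transfers of large data sets.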


User support and training

We are always here to help. The User Services team offers a series of presentations each semester to help researchers get started with CHPC resources. Presentations generally do not require registration, and there is no cost to attend. We provide additional information, including the process for obtaining a CHPC account, in the Getting Started Guide.

If you have any questions, please don't hesitate to reach out. Our contact information is available on the Getting Help page.

Last Updated: 7/31/24