Accessing Deep Learning Containers

Software on HPC is accessed via environment modules or containers.

  • Software Module:
    • Environment modules modify your shell environment (for example, your PATH) so that a centrally installed application and its dependencies become available when you load the corresponding module.
  • Container:
    • Containers bundle an application, the libraries and other executables it may need, and even the data used with the application into portable, self-contained files called images.
    • Containers simplify installation and management of software with complex dependencies and can also be used to package workflows.

Visit our documentation on HPC software modules and containers for more information.
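
As a quick sketch of how these pieces fit together on the command line (assuming the Lmod module system and that each container module sets the $CONTAINERDIR variable used in the run commands later on this page), you can load a container module and inspect the image it provides:

# Load the Apptainer runtime and a containerized application (PyTorch here)
module load apptainer pytorch

# The container module is assumed to set $CONTAINERDIR; list the image file(s) it provides
ls $CONTAINERDIR

# Inspect the image's metadata, including its default run command
apptainer inspect $CONTAINERDIR/pytorch-2.0.1.sif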

Access through Open OnDemand

  • In your Open OnDemand JupyterLab session, click on the kernel that corresponds to the desired container to open a Jupyter Notebook.
  • Packages from the selected kernel will be available for use in the notebook.

Run DL Script on the Command Line

Use the PyTorch container:

module load apptainer pytorch 
apptainer run --nv $CONTAINERDIR/pytorch-2.0.1.sif file_name.py

Use the TensorFlow/Keras container:

module load apptainer tensorflow 
apptainer run --nv $CONTAINERDIR/tensorflow-2.13.0.sif file_name.py
  • The --nv flag enables NVIDIA GPU support inside the container.
  • The default command defined in each container is python, so apptainer run effectively executes python file_name.py inside the container.
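
The same two lines can also be placed in a Slurm batch script and submitted as a non-interactive job. Below is a minimal sketch; the partition name gpu, the resource requests, and the allocation placeholder are assumptions you should adjust for your site.

#!/bin/bash
#SBATCH --job-name=dl_example        # name shown in the queue
#SBATCH --partition=gpu              # assumed GPU partition name; adjust as needed
#SBATCH --gres=gpu:1                 # request one GPU
#SBATCH --cpus-per-task=8            # CPU cores for data loading
#SBATCH --time=02:00:00              # walltime limit
#SBATCH --account=<your_allocation>  # placeholder allocation name

module load apptainer pytorch
apptainer run --nv $CONTAINERDIR/pytorch-2.0.1.sif file_name.py

Submit the script with sbatch and check its status with squeue -u $USER.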

Check Your Knowledge

  1. Log in to Rivanna and start a session on the interactive partition with the following parameters:
  • Time: 2 hours
  • Cores: 8
  • Allocation: hpc_training
  • GPU: yes, 1
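If you prefer to request the session from a login-node shell instead of the Open OnDemand form, a roughly equivalent Slurm request is sketched below; the exact partition name and request syntax may differ on your system.

salloc --partition=interactive --time=02:00:00 --cpus-per-task=8 --gres=gpu:1 --account=hpc_training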
  2. Copy the folder /project/hpc_training/dl_with_hpc to your home or scratch directory using one of the following:
cp -r /project/hpc_training/dl_with_hpc ~/<...>
# OR
cp -r /project/hpc_training/dl_with_hpc /scratch/<ID>/<...>
  3. Convert the example1.ipynb notebook to example1.py (make sure you are in the dl_with_hpc folder):
jupyter nbconvert --to python example1.ipynb
  4. Run example1.py using the PyTorch container.
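A possible set of commands for this last step, reusing the container commands shown above (run from inside the dl_with_hpc folder):

module load apptainer pytorch
apptainer run --nv $CONTAINERDIR/pytorch-2.0.1.sif example1.py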