- Uncomment the `runArgs` section in `.devcontainer/devcontainer.json`.
- Uncomment the lines for the `nvidia` channel and the `pytorch-cuda` package in the `env.yml` file.
- Install the NVIDIA Container Toolkit and configure Docker by following this guide. You only need the steps under "Installing with Apt" and "Configuring Docker"; skip the "Rootless mode" section.

> ⚠️ Windows users: First verify that you can run `nvidia-smi` in WSL. Then install and configure the NVIDIA Container Toolkit in WSL.

> ⚠️ Windows users: For step 2 under "Configuring Docker", instead of `sudo systemctl restart docker`, run `sudo service docker restart`.
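For reference, the uncommented pieces usually look something like the following. This is a sketch only; the exact lines (and the `pytorch-cuda` version) in your `.devcontainer/devcontainer.json` and `env.yml` may differ:

```jsonc
// .devcontainer/devcontainer.json (sketch; your file may differ)
{
  // Pass all host GPUs through to the container
  "runArgs": ["--gpus", "all"]
}
```

```yaml
# env.yml (sketch; channel list and version are illustrative)
channels:
  - nvidia
  - pytorch
dependencies:
  - pytorch-cuda  # pin to the version your repo specifies
```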
You can now proceed with installing the devcontainer. Once the installation is complete, check GPU access by running the following in the devcontainer terminal:

```shell
nvidia-smi
```

You should see your GPU stats.
Then check that the PyTorch CUDA runtime is installed, in a Python console:

```python
import torch
torch.cuda.is_available()  # should return True
```
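For a slightly more thorough end-to-end check than `torch.cuda.is_available()`, the sketch below also runs a tiny computation on the GPU. It uses only standard `torch.cuda` calls; the printed device name and count will of course depend on your hardware:

```python
import torch

if torch.cuda.is_available():
    # Report what the container can see.
    print(f"CUDA devices: {torch.cuda.device_count()}")
    print(f"Device 0: {torch.cuda.get_device_name(0)}")
    # A tiny matmul on the GPU confirms the runtime actually works,
    # not just that the driver is visible.
    x = torch.ones(2, 2, device="cuda")
    print((x @ x).sum().item())  # 2x2 all-ones matmul sums to 8.0
else:
    print("CUDA not available - recheck the steps above.")
```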