marcc-hpc/pytorch:latest
$ singularity pull shub://marcc-hpc/pytorch:latest
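Once pulled, the image can be run with NVIDIA GPU support via the `--nv` flag. A minimal usage sketch — the image filename below is the default produced by newer Singularity versions and may differ on your system (older versions produce a `.simg` name):

```shell
# Run a quick CUDA check inside the pulled image.
# --nv binds the host NVIDIA driver libraries into the container.
# "pytorch_latest.sif" is an assumed filename; adjust to match what pull produced.
singularity exec --nv pytorch_latest.sif \
    python -c "import torch; print(torch.cuda.is_available())"
```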
Singularity Recipe
Bootstrap: docker
From: marcchpc/pytorch_cuda9

%environment
    # use bash as the default shell
    SHELL=/bin/bash
    export SHELL
    # add CUDA paths
    CPATH="/usr/local/cuda/include:$CPATH"
    PATH="/usr/local/cuda/bin:$PATH"
    LD_LIBRARY_PATH="/usr/local/cuda/lib64:$LD_LIBRARY_PATH"
    CUDA_HOME="/usr/local/cuda"
    export CPATH PATH LD_LIBRARY_PATH CUDA_HOME
    # make the conda environment accessible
    PATH=/opt/conda/envs/pytorch-py3.6/bin:$PATH
    export PATH

%setup
    # runs on the host; the path to the image is $SINGULARITY_ROOTFS

%post
    # post-setup script
    # load environment variables
    . /environment
    # make the environment file executable
    chmod +x /environment
    # create default mount paths and files
    mkdir /scratch /data /work-zfs
    touch /usr/bin/nvidia-smi
    # user-requested packages (contact marcc-help@marcc.jhu.edu)
    /opt/conda/bin/conda install opencv scikit-learn scikit-image scipy pandas
    /opt/conda/bin/conda install -c anaconda numpy pytest flake8 tensorflow-tensorboard
    /opt/conda/bin/conda install -c conda-forge tensorboardx tqdm protobuf onnx spectrum nibabel
    # pip installs
    /opt/conda/bin/pip install torchtext
    /opt/conda/bin/pip install pretrainedmodels

%runscript
    # executes with the singularity run command
    # delete this section to use the existing Docker ENTRYPOINT

%test
    # test that the build succeeded
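The `%post` section above creates `/scratch`, `/data`, and `/work-zfs` as mount points for MARCC filesystems, so host directories can be bound onto them at run time. A hedged example — the host path and image filename are placeholders, not from the source:

```shell
# Bind a host scratch directory onto the /scratch mount point created in %post.
# $HOME/scratch and pytorch_latest.sif are assumed names; substitute your own.
singularity shell --nv --bind $HOME/scratch:/scratch pytorch_latest.sif
```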
Collection
- Name: marcc-hpc/pytorch
- License: MIT License
Metrics
| key | value |
|---|---|
| id | /containers/marcc-hpc-pytorch-latest |
| collection name | marcc-hpc/pytorch |
| branch | 0.5.0 |
| tag | latest |
| commit | 902a887b1ac715cbac1d11debf8cd82984d73a6a |
| version (container hash) | 4015e08c7c652c85c811f40024548087 |
| build date | 2020-05-21T09:44:18.123Z |
| size (MB) | 9028 |
| size (bytes) | 4229750815 |
| SIF | Download URL (please use pull with shub://) |
| Datalad URL | View on Datalad |
| Singularity Recipe | Singularity Recipe on Datalad |