nickjer/singularity-rstudio-spark:2.3.0-hadoop-2.7-r-3.4.3

$ singularity pull shub://nickjer/singularity-rstudio-spark:2.3.0-hadoop-2.7-r-3.4.3
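Once pulled, the image can be executed directly; the recipe below routes the default runscript to `spark-class`, so any Spark class name and its arguments can be passed straight through. A minimal sketch (the image filename is hypothetical; `singularity pull` naming varies by Singularity version):

```shell
# Hypothetical image filename; adjust to whatever `singularity pull` produced
IMG="nickjer-singularity-rstudio-spark-2.3.0-hadoop-2.7-r-3.4.3.simg"

if command -v singularity >/dev/null 2>&1 && [ -f "$IMG" ]; then
  # The default runscript forwards all arguments to spark-class
  singularity run "$IMG" org.apache.spark.deploy.master.Master --help
else
  # Dry run when Singularity or the image is not available locally
  echo "would run: singularity run $IMG <spark-class arguments>"
fi
```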

Singularity Recipe

BootStrap: shub
From: nickjer/singularity-rstudio:3.4.3

%labels
  Maintainer Jeremy Nicklas
  Spark_Version 2.3.0
  Hadoop_Version 2.7

%help
  This container runs Apache Spark on top of an RStudio Server base image

%apprun spark-class
  exec spark-class "${@}"

%apprun spark-master
  exec spark-class "org.apache.spark.deploy.master.Master" "${@}"

%apprun spark-worker
  exec spark-class "org.apache.spark.deploy.worker.Worker" "${@}"

%runscript
  exec spark-class "${@}"

%environment
  export SPARK_HOME=/opt/spark
  export PATH=${SPARK_HOME}/bin:${PATH}

%post
  # Software versions
  export SPARK_VERSION=2.3.0
  export HADOOP_VERSION=2.7

  # Install Spark
  apt-get update
  apt-get install -y --no-install-recommends \
    openjdk-8-jre
  mkdir -p /opt/spark
  wget \
      --no-verbose \
      -O - \
      "http://mirror.cc.columbia.edu/pub/software/apache/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" \
    | tar xz --strip-components=1 -C /opt/spark

  # Install sparklyr (R interface for Apache Spark); warnings are promoted
  # to errors so a failed package install aborts the build
  Rscript -e " \
    withCallingHandlers( \
      install.packages( \
        c('tidyverse', 'sparklyr'), \
        lib = '/usr/local/lib/R/site-library', \
        repos = 'https://cran.rstudio.com/', \
        clean = TRUE \
      ), \
      warning = function(w) stop(w) \
    ) \
  "

  # Clean up
  rm -rf /var/lib/apt/lists/*
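The `%apprun` entries above give the image three named entry points. A standalone cluster can be sketched by starting a master and then pointing a worker at it. A hedged example (the image name is hypothetical, and 7077 is Spark's default standalone master port):

```shell
# Hypothetical local image name produced by `singularity pull`
IMG="singularity-rstudio-spark.simg"
MASTER_URL="spark://$(hostname):7077"   # 7077 is Spark's default master port

if command -v singularity >/dev/null 2>&1 && [ -f "$IMG" ]; then
  # Start the standalone master, then a worker that registers with it
  singularity run --app spark-master "$IMG" --host "$(hostname)" --port 7077 &
  singularity run --app spark-worker "$IMG" "$MASTER_URL" &
else
  # Dry run when Singularity or the image is not available locally
  echo "would start a master at $MASTER_URL and a worker attached to it"
fi
```

The worker takes the master URL as its positional argument, matching the `org.apache.spark.deploy.worker.Worker` entry point defined in the recipe.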

Metrics

key                       value
id                        /containers/nickjer-singularity-rstudio-spark-2.3.0-hadoop-2.7-r-3.4.3
collection name           nickjer/singularity-rstudio-spark
branch                    master
tag                       2.3.0-hadoop-2.7-r-3.4.3
commit                    4cdff7648a01da1348ddef22f3e2cf0f53e7c5a2
version (container hash)  ee55e49269da8c56bf7c2390c8e93de3
build date                2018-03-11T21:43:45.238Z
size (MB)                 1744
size (bytes)              687702047
SIF download URL          (please use pull with shub://)
Datalad URL               View on Datalad
Singularity Recipe        Singularity Recipe on Datalad
We cannot guarantee that all containers will still exist on GitHub.