Here are some tips and tricks for creating a Dockerfile for your submission to this challenge. See also the official Dockerfile reference and the Dockerfile best-practices guide.
Pick a suitable base image for your container. Search Docker Hub for your preferred operating system, for example CentOS or Ubuntu. Pin a version tag (e.g. centos:7), because the :latest tag (the default when no tag is specified) might change between your submission and our evaluation.
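For example, the first line of your Dockerfile could pin the base image like this (centos:7 is just an illustration; substitute your preferred OS and version):

```dockerfile
# Pin an explicit version tag so the image does not change under you;
# :latest would resolve to whatever is newest at build time.
FROM centos:7
```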
You can also search Google or GitHub for base images that already contain the tools you need. There is a high chance that your favourite neuroimaging tools have already been containerized.
If you want to use a GPU, be sure to pick an nvidia/cuda base image. Have a look at all the available tags to find your favourite OS.
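A sketch of what that could look like (the tag below is only an example; check the nvidia/cuda page on Docker Hub for the CUDA version and OS combination that matches your needs):

```dockerfile
# Example: a runtime image with CUDA libraries on Ubuntu 22.04.
# Pick "base", "runtime", or "devel" depending on whether you need
# the compiler toolchain inside the container.
FROM nvidia/cuda:11.8.0-runtime-ubuntu22.04
```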
Limit the number of RUN commands by combining them into a single RUN instruction. Each instruction in a Dockerfile creates a layer in your container, increasing the total container size. So instead of this:
RUN apt-get update
RUN apt-get install -y aufs-tools
RUN apt-get install -y automake
RUN apt-get install -y build-essential
RUN apt-get install -y libsqlite3-dev
RUN apt-get install -y s3cmd=1.1.*
RUN rm -rf /var/lib/apt/lists/*
do this:

RUN apt-get update && apt-get install -y \
    aufs-tools \
    automake \
    build-essential \
    libsqlite3-dev \
    s3cmd=1.1.* \
 && rm -rf /var/lib/apt/lists/*
Be sure to do the clean-up within the same RUN command! Otherwise the clean-up happens in a new layer and has no effect on the size of the previous layers.
If your base image contains a non-default ENTRYPOINT (or USER, CMD, WORKDIR), you can reset this with:
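A minimal sketch, assuming the base image overrides all four (the values shown are illustrative defaults; adjust them to what your submission actually needs):

```dockerfile
# Reset settings inherited from the base image.
USER root
WORKDIR /
ENTRYPOINT []
CMD ["/bin/bash"]
```

Setting ENTRYPOINT to an empty array clears any startup command the base image defined, so the container runs whatever CMD (or the command you pass at run time) specifies.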
For example, the jupyter docker-stacks Python base images start a notebook server by default, which is probably not needed for this challenge.