Running Jupyter Notebook inside a Docker container is a powerful and efficient way to manage your development environment. Whether you’re a data scientist, machine learning engineer, or Python developer, Docker provides a consistent, reproducible, and isolated environment that simplifies dependency management and avoids conflicts with other local installations. In this guide, you’ll learn how to run Jupyter Notebook in a Docker container step-by-step, including how to build your own image, use existing images, mount volumes, expose ports, and persist your work.
Why Run Jupyter Notebook in a Docker Container?
Before diving into the setup, it’s important to understand the key benefits of running Jupyter Notebook in Docker:
- Isolation: Run multiple environments with different Python or library versions without conflicts.
- Reproducibility: Share images with others and ensure the environment runs exactly the same way everywhere.
- Portability: Develop on your local machine and deploy to the cloud with ease.
- Version Control: Dockerfiles can be version-controlled, ensuring reproducible environments.
- Lightweight: Containers are more resource-efficient compared to full virtual machines.
Prerequisites
Before you get started, make sure you have the following installed:
- Docker: Install Docker Desktop from https://www.docker.com/products/docker-desktop/
- Basic Terminal or Command Line Knowledge: Familiarity with commands like `docker build`, `docker run`, and `docker ps`.
Method 1: Run Jupyter Using the Official Docker Image
The simplest and fastest way to get Jupyter up and running in Docker is by using the official pre-built image maintained by Project Jupyter. These images come pre-configured for various use cases.
Pull the Base Docker Image
Begin by pulling the base image:
docker pull jupyter/base-notebook
You can also explore other variants like `scipy-notebook` or `datascience-notebook`, depending on your requirements.
Run the Jupyter Notebook Container
To run the container and expose the notebook on your local port 8888:
docker run -it --rm -p 8888:8888 jupyter/base-notebook
After the container launches, it will display a tokenized URL in the logs. Copy and paste it into your browser to open Jupyter.
Persist Your Notebooks with Volume Mounting
By default, files created inside the container are lost when the container is removed (for example, when it was started with `--rm`). To persist notebooks on your host:
docker run -p 8888:8888 -v "$(pwd)":/home/jovyan/work jupyter/base-notebook
This maps your current local directory to the container's working directory (`/home/jovyan/work`). Quoting `$(pwd)` guards against spaces in the path; on Windows PowerShell, use `${PWD}` instead.
Customize Access: Remove Token Authentication
To disable the token entirely (only do this in trusted, local environments):
docker run -p 8888:8888 jupyter/base-notebook start-notebook.sh --NotebookApp.token='' --NotebookApp.password=''
This provides quick access without authentication.
Method 2: Build Your Own Docker Image
For advanced projects that require additional libraries or custom configurations, building your own Docker image gives you total control over your Jupyter environment.
Write a Dockerfile with Custom Libraries
Create a file named `Dockerfile` and add:
FROM jupyter/base-notebook
RUN pip install scikit-learn matplotlib xgboost pandas
This example starts from the base image and installs additional Python packages.
Build the Docker Image
To build the image from your custom Dockerfile:
docker build -t my-custom-jupyter .
This creates an image tagged as `my-custom-jupyter`.
Launch Your Custom Jupyter Container
Run your image with volume mapping for persistence:
docker run -p 8888:8888 -v $(pwd):/home/jovyan/work my-custom-jupyter
This ensures that your notebooks and datasets remain on your local machine.
Extend with Conda or requirements.txt
You can further customize your image by adding a `requirements.txt` or a Conda environment YAML file. Just copy the file into the image and add an install step in the Dockerfile.
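A minimal sketch of the Conda route, assuming an `environment.yml` file sitting next to your Dockerfile (`conda` ships with the jupyter docker-stacks images):

```dockerfile
FROM jupyter/base-notebook

# Copy the environment spec into the image and update the default
# (base) environment from it, then clean caches to keep the image small.
COPY environment.yml /tmp/environment.yml
RUN conda env update -n base -f /tmp/environment.yml && \
    conda clean --all -f -y
```

The `requirements.txt` variant works the same way, with `COPY requirements.txt .` and a `pip install -r requirements.txt` step instead.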
Method 3: Use Docker Compose for Multi-Service Workflows
Docker Compose lets you manage multi-container setups with ease — perfect for combining Jupyter with databases or APIs.
Create a Docker Compose File
Start by creating a file named `docker-compose.yml`:
version: '3.8'
services:
  jupyter:
    image: jupyter/scipy-notebook
    ports:
      - "8888:8888"
    volumes:
      - ./notebooks:/home/jovyan/work
    environment:
      - JUPYTER_ENABLE_LAB=yes
This configuration runs the `scipy-notebook` image and mounts the `./notebooks` folder from your host.
Start All Containers with One Command
Run the following in the directory where your `docker-compose.yml` file is located:
docker-compose up
This starts the Jupyter Notebook server and any other services you define. (On newer Docker installations, the equivalent command is `docker compose up`.)
Add a PostgreSQL Service
To connect your notebooks to a database, add this block under `services:` in your `docker-compose.yml`:
  db:
    image: postgres:15
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
    ports:
      - "5432:5432"
Your notebook can now connect to the database via the hostname `db`.
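Inside a notebook, a minimal sketch of the connection settings, matching the credentials above (an actual connection would use a driver such as psycopg2 or SQLAlchemy, installed into your image):

```python
# Connection settings matching the Compose file above. "db" is the
# Compose service name, which Docker's internal DNS resolves as a
# hostname from the jupyter container on the shared network.
settings = {
    "host": "db",
    "port": 5432,
    "user": "user",
    "password": "password",
    "dbname": "mydb",
}

# SQLAlchemy-style connection URL built from the same settings.
url = "postgresql://{user}:{password}@{host}:{port}/{dbname}".format(**settings)
print(url)  # postgresql://user:password@db:5432/mydb
```

Note that from your host machine (outside Docker) the same database is reachable at `localhost:5432` instead, because of the published port.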
Benefits of Using Compose
- Unified configuration: Easily manage services and shared networks.
- Scalable: Add services like Redis, Kafka, or Flask APIs without complex scripts.
- CI/CD ready: Use in pipelines for reproducible test environments.
Common Configuration Options
When running Jupyter Notebook in Docker, you might want to customize the behavior further.
Change the Default Port
If port 8888 is in use, you can bind another port:
docker run -p 9999:8888 jupyter/base-notebook
Then open http://localhost:9999.
Disable Token Authentication (For Local Use Only)
docker run -p 8888:8888 jupyter/base-notebook start-notebook.sh --NotebookApp.token=''
Warning: This disables all security—only use it in safe, offline environments.
Add a Custom Requirements File
If you have a `requirements.txt`:
FROM jupyter/base-notebook
COPY requirements.txt .
RUN pip install -r requirements.txt
This installs all Python packages listed in your file.
How to Stop and Clean Up Containers
When you’re done working:
- Stop a container: press `Ctrl+C` in the terminal or run `docker stop <container_id>`
- Remove a container: `docker rm <container_id>`
- List containers: `docker ps` (running) and `docker ps -a` (all)
- List images: `docker images`
- Remove an image: `docker rmi my-custom-jupyter`
Running Jupyter Notebook with GPU (Advanced)
If you’re using machine learning models that benefit from GPU acceleration and have the NVIDIA Container Toolkit installed, you can run:
docker run --gpus all -p 8888:8888 my-custom-jupyter
Make sure your image includes CUDA-compatible versions of PyTorch or TensorFlow.
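As a sketch, such an image could extend the base notebook with a GPU build of PyTorch; the `cu121` index URL below is PyTorch's CUDA 12.1 wheel repository, used here as an assumed example (pick the index matching your driver from pytorch.org):

```dockerfile
FROM jupyter/base-notebook

# Install a CUDA-enabled PyTorch build. The wheel index must match
# the CUDA version supported by your host's NVIDIA driver.
RUN pip install --no-cache-dir torch --index-url https://download.pytorch.org/whl/cu121
```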
Hosting Jupyter Notebook in the Cloud via Docker
You can deploy your container to cloud providers like AWS, GCP, or Azure using services like:
- AWS ECS or EKS
- Google Cloud Run or Kubernetes Engine
- Azure Container Instances
Just package your notebook environment in a Docker image, push it to a container registry, and deploy.
Best Practices
- Use a `.dockerignore` file to exclude unnecessary files.
- Use named volumes instead of bind mounts for long-term storage.
- Keep Docker images lean by starting from minimal base images.
- Tag your images with versions to avoid accidental overwrites.
- Monitor container resource usage with `docker stats`.
Troubleshooting Tips
Issue: Cannot connect to localhost:8888
Solution: Check the container logs for the tokenized URL and confirm the port is published with `-p 8888:8888`.
Issue: Changes disappear after container restarts
Solution: Use `-v "$(pwd)":/home/jovyan/work` to persist data on your host.
Issue: Permission denied on mounted volume
Solution: Ensure your host files are readable and writable by the `jovyan` user inside the container.
Issue: Port already in use
Solution: Map a different host port, e.g. `-p 8899:8888`.
Conclusion
Running Jupyter Notebook in a Docker container is an efficient way to manage your development environment, especially for data science and machine learning projects. It offers isolation, portability, and repeatability across systems and teams. Whether you’re using the official image, customizing your own, or orchestrating services with Docker Compose, this setup will help you stay productive and organized. Mastering Docker with Jupyter ensures that your work is clean, scalable, and ready for deployment anywhere.