Managing multiple Jupyter Notebook projects often means juggling different Python libraries and dependencies. Instead of setting up a separate virtual environment and installing Jupyter for each project, I use Docker to streamline the process.
We can use Conda as well, and I will show how, but personally I prefer Docker.
Create an isolated environment for each project
I had a previous post on how to set up virtual environments using pipenv and virtualenv.
And I had this post about Conda.
But since then, I have switched to mostly Conda or Docker, especially for Jupyter notebooks.
Conda also manages virtual environments, and if you just want a tool to create environments and install packages, then in my opinion Conda is the best among the alternatives: uv, venv, virtualenv, pipenv, etc.
Docker offers the same level of environment isolation as virtual environments, but with several added benefits:
- Pre-packaged with all required dependencies.
- Clean, reproducible environments.
- No need to install Jupyter separately for every project.
Docker Compose Setup
Here’s a minimal docker-compose.yml file I use. It pulls a base Jupyter image and sets up shared volumes for your notebooks and environment variables.
```yaml
version: "3.8"

services:
  jupyter:
    image: jupyter/minimal-notebook
    container_name: jupyter_tutorial
    ports:
      - "8888:8888"
    volumes:
      - ./notebooks:/home/jovyan/work
      - ./.env:/home/jovyan/.env
    environment:
      - DOTENV=/home/jovyan/.env
    command: start-notebook.sh --NotebookApp.token=''
```
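To use it, bring the stack up from the project directory and open the URL it serves. A quick sketch of the commands, assuming Docker Compose v2 is installed:

```shell
# Start the container in the background
docker compose up -d

# JupyterLab is now reachable at http://localhost:8888
# (no token prompt, because the config above disables it)

# Stop and remove the container when you are done
docker compose down
```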
What about the .env file?
I include a .env file in the root directory to securely store my API keys for services like GPT, Claude, and others.
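From a shell inside the container, the key/value pairs can be exported by sourcing the file. A minimal sketch — the key name MY_API_KEY and its sample value are made up for illustration, and the DOTENV variable is the one set in the compose file:

```shell
# Create a sample .env just for this demo (in a real project the file already exists)
printf 'MY_API_KEY=sk-example\n' > .env

# `set -a` auto-exports every variable the sourced file defines,
# so notebook processes started from this shell inherit them
set -a
. "${DOTENV:-.env}"   # falls back to ./.env when DOTENV is unset
set +a

echo "$MY_API_KEY"
```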
Project Structure
Your project directory might look something like this:
```
my-jupyter-project/
├── docker-compose.yml
├── .env
└── notebooks/
    └── your_project.ipynb
```
Conda Template Environment
Conda provides different ways to create an environment with base packages. You can create a base environment and then clone it, or you can create a template environment, as follows:
1. Create the template environment
```shell
conda create -n jupyter-template python=3.11 jupyterlab notebook ipykernel
```
2. Export the environment to a YAML file
```shell
conda activate jupyter-template
conda env export --no-builds > jupyter-template.yaml
```
3. Use the template to create a new environment
```shell
conda env create -n my-new-project -f jupyter-template.yaml
```

(Note: this must be `conda env create -f`, not `conda create --file` — the latter expects a plain package-spec list, not the YAML produced by `conda env export`.)
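The cloning route mentioned at the start of this section is a one-liner:

```shell
# Copy the template environment directly instead of going through a YAML file
conda create -n my-new-project --clone jupyter-template
```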
Or create the template YAML manually
If you want to create the file manually, here it is:
```yaml
name: jupyter-base
channels:
  - defaults
dependencies:
  - python=3.11
  - notebook
  - jupyterlab
  - ipykernel
```
and then you can use it as follows:
```shell
conda env create -f jupyter-template.yaml -n my-new-env
```
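If you run one central Jupyter installation and want it to offer the new environment as a kernel, the environment can be registered with Jupyter — this is what the ipykernel package in the template is for. A sketch, reusing the environment name from above:

```shell
conda activate my-new-env
# Register the environment as a selectable Jupyter kernel
python -m ipykernel install --user --name my-new-env --display-name "Python (my-new-env)"
```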
Another alternative: the Python Docker image
Another alternative is to just use the official Python Docker image and create a new container for each project:
```shell
docker run -it --rm python:3.11 bash
```
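On its own, that container has no Jupyter installed and no port mapping, so in practice you would also publish the port and mount the project directory. A sketch, with /work as an arbitrary mount point:

```shell
# Publish the notebook port and mount the current directory into the container
docker run -it --rm -p 8888:8888 -v "$PWD":/work -w /work python:3.11 bash

# Then, inside the container:
pip install jupyterlab
jupyter lab --ip=0.0.0.0 --allow-root --no-browser
```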
Other virtual environment tools:
There are other virtual environment tools, such as uv, venv, virtualenv, and Poetry. Some of them are geared more toward packaging production code, like Poetry; some mimic npm-style workflows, like pipenv; and some are fast, like uv. But personally I think Conda (especially Miniconda) is the easiest to use.
You can get more here.
Conclusion
With this setup, you can quickly spin up isolated environments for your Jupyter projects without redundant installs or configurations.