How to run pip requirements.txt on Docker if there is a change?


Building Docker images often involves installing Python packages from a requirements.txt file. However, pip installing all packages from scratch every time can considerably slow down Docker builds. In this comprehensive guide, you’ll learn techniques to cache pip installs in your Dockerfile so requirements.txt is only re-run when necessary.


The Issue: Docker Builds Reinstalling Requirements Unnecessarily

Consider this simple Dockerfile:

FROM python:3.11
COPY . /app/
WORKDIR /app
RUN pip3 install -r requirements.txt

On the first run, this will:

  1. Start from the python:3.11 image
  2. Copy the entire project, including requirements.txt, into the /app directory
  3. Switch the working directory to /app
  4. Pip install all packages in requirements.txt

However, because the whole project is copied in a single step, any change to any file invalidates Docker's cache for that COPY layer and for every layer after it. Pip then reinstalls all packages from scratch on subsequent builds even if requirements.txt itself hasn't changed! This leads to slow Docker builds, as every package gets redownloaded and reinstalled unnecessarily.

As an example, here’s a sample output showing the pip install step re-running even though requirements.txt hasn’t changed:

Step 4/4 : RUN pip3 install -r requirements.txt
 ---> Running in 2346e4ffb13b
Collecting pandas
  Downloading pandas-2.0.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.3 MB)
Installing collected packages: pandas
Successfully installed pandas-2.0.3

So how do we avoid this?

The Solution: Cache Based on Requirements File Change

The key is to leverage Docker’s build cache to avoid repeat installs. Docker reuses a cached layer when a step’s instruction and the files it depends on are unchanged since the last build.

We can achieve this by:

  1. Copying only requirements.txt in one step
  2. Pip installing in the next step

For example:

FROM python:3.11

# Step 1 - Copy only requirements.txt
COPY requirements.txt /app/

# Step 2 - Install dependencies
WORKDIR /app
RUN pip3 install -r requirements.txt

Now the pip3 install step re-runs only when requirements.txt changes between builds!

The install will be cached as long as requirements.txt remains unchanged. Let’s see this in action:

Step 2/4 : COPY requirements.txt /app/
 ---> Using cache
 ---> 1476af48ef75
Step 3/4 : WORKDIR /app
 ---> Using cache
 ---> 175de77a703e
Step 4/4 : RUN pip3 install -r requirements.txt
 ---> Using cache
 ---> 0f2868109b56

Much better! Docker now skips the install step entirely whenever the checksum of requirements.txt matches the previous build.


Copying More Efficiently with Multiple Requirements Files

Large projects often split requirements into multiple files – for example:

  • requirements.txt – Common dependencies
  • requirements-dev.txt – Extra packages for development
  • requirements-test.txt – Packages only needed for testing
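
As a concrete illustration, the split files might look like this — the package names and version pins below are placeholders, not taken from any real project:

```text
# requirements.txt - common dependencies
pandas==2.0.3
requests==2.31.0

# requirements-dev.txt - extra packages for development
black==23.12.1
ipython==8.18.1

# requirements-test.txt - packages only needed for testing
pytest==7.4.3
coverage==7.3.2
```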

A naive way to handle this would be:

FROM python:3.11

COPY requirements.txt requirements-dev.txt requirements-test.txt /app/

WORKDIR /app
RUN pip3 install -r requirements.txt \
                 -r requirements-dev.txt \
                 -r requirements-test.txt

The problem is that a change to any one requirements file invalidates the cache for the combined pip install step, so all three sets of packages get reinstalled.

We can do better by copying and installing each file in its own step:

FROM python:3.11

WORKDIR /app

# Copy main requirements
COPY requirements.txt /app/
RUN pip3 install -r requirements.txt

# Copy dev requirements
COPY requirements-dev.txt /app/
RUN pip3 install -r requirements-dev.txt

# Copy test requirements
COPY requirements-test.txt /app/
RUN pip3 install -r requirements-test.txt

Now pip installs are cached individually per requirements file. For example, changing only requirements-dev.txt won’t invalidate caching for the main install.
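
Application code that changes often should be copied after all of the install steps, so ordinary code edits never invalidate the dependency layers. A sketch of the full layout — app.py is a hypothetical entry point, so adjust the names to your project:

```dockerfile
FROM python:3.11
WORKDIR /app

# Dependency layers: cached until a requirements file changes
COPY requirements.txt /app/
RUN pip3 install -r requirements.txt
COPY requirements-dev.txt /app/
RUN pip3 install -r requirements-dev.txt

# Code layer: changes often, but only invalidates the steps below it
COPY . /app/
CMD ["python3", "app.py"]
```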

The general rule is:

A step will use the cache ONLY if the files it uses haven’t changed since the last build AND every preceding step also hit the cache

Copying the Whole Folder to Docker Before Running Pip Install

Sometimes we do want to invalidate caching globally.

For example, copying over the entire codebase into an image:

COPY . /app
RUN pip3 install -r requirements.txt

Here, any code change invalidates the COPY layer and therefore re-runs the pip install, which is sometimes exactly what you want. So there is a tradeoff depending on the use case.
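
Even in that case, you can keep pip’s download cache warm across rebuilds. Assuming BuildKit is enabled (it is the default builder in recent Docker versions), a cache mount persists pip’s cache directory between builds, so a re-run of the layer reuses already-downloaded wheels — a sketch:

```dockerfile
# syntax=docker/dockerfile:1
FROM python:3.11
WORKDIR /app
COPY . /app/
# The cache mount keeps /root/.cache/pip between builds, so even
# when this layer is invalidated, pip skips redownloading wheels.
RUN --mount=type=cache,target=/root/.cache/pip \
    pip3 install -r requirements.txt
```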


Using Docker’s cache properly is vital for fast builds and rapid iteration. Use separate copy and install steps per requirements file to maximize cache hits, and reserve whole-folder copies for cases where you genuinely want dependencies reinstalled on every code change. Apply these techniques, and your pip installs will be blazing fast!