Introduction

FastAPI is a popular Python API framework that has a neat integration with the data validation library Pydantic. This makes manually validating user input a thing of the past. Although it is straightforward to get started locally, your first endpoint is just a small part of creating production-ready APIs. In this blog post, I introduce an open source template for FastAPI, which is the result of building and deploying several FastAPI projects over the years.
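
To illustrate what that looks like (a minimal sketch, not code from the template), here is an endpoint where Pydantic validates the request body automatically:

from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI()

class CreateUser(BaseModel):
    # Requests that do not match this schema are rejected
    # with a 422 response; no manual validation needed.
    name: str = Field(min_length=1)
    age: int = Field(ge=0)

@app.post("/users")
def create_user(user: CreateUser) -> dict:
    # At this point, `user` is already parsed and validated.
    return {"message": f"Created {user.name}"}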

You can find the code on GitHub: https://github.com/BiteStreams/fastapi-template.

Why use a Template

The template comes with the following integrations:

  • The API runs in a Docker container, making it very easy to get started and switch between projects, operating systems and machines.
  • It has a database integration built-in, which is not included in FastAPI by default (also see my FastAPI SQLAlchemy post).
  • The dependencies are managed by Poetry, a modern tool that also locks the dependencies for consistent builds (together with Docker).
  • A Makefile, acting as an 'entrypoint' to the project, allowing one to install the dev setup with a single command. (I typically use the same entrypoints for all projects, enabling easier context switches.)
  • Workflow files to run in GitHub Actions, allowing integration tests to run both locally and in a CI pipeline.
  • Automatic linting to ensure consistency and catch trivial bugs.

A Combination of Docker & Make

Using Docker makes developing & deploying the code much more seamless; it brings numerous advantages at the cost of some added complexity. Docker enables a lightweight, reproducible environment for running your code & tests, as the environment itself becomes scriptable. The paradigm of 'test' containers is particularly powerful, as it allows you to build a fleet of containers purely for running test suites.

Dockerized test suites

In the template, we set up two containerized test suites: a unit and an integration test suite. This makes the tests very reproducible, although you might lose some iteration speed, on your unit tests in particular. As a middle ground, you can run your unit tests locally as well and use the containerized version as a sanity check. Another advantage of containerized tests is that you can always run all the test suites right after checking out the project (using make test).
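
As a sketch of what such a setup can look like (the service names and paths here are illustrative, not the template's exact files), a compose file can define a throwaway database plus a test-runner container:

services:
  test-db:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: test
  itest:
    build: .
    command: pytest tests/integration   # run only the integration suite
    depends_on:
      - test-db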

This is where Make comes in. Make allows us to forget about all the specific test arguments and focus on our goal: running the tests. (Similarly, you can apply this to other operations projects typically have, such as make up for starting the application stack.) By consistently using these Make targets, it becomes easy to switch between projects and onboard new developers. Onboarding can often be a tedious process when a specific set of tools with specific versions is required; this solution aims to resolve that pain.
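
As a sketch, such Make targets can look like this (the recipes are illustrative, not the template's exact Makefile):

# Run unit and integration tests
test: utest itest

# Run the integration tests in containers
itest:
	docker compose -f docker-compose.test.yml run --rm itest

# Start the application stack
up:
	docker compose up --build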

CI/CD Pipelines with GitHub Actions

Another advantage of using Docker & Make arises when we look at building CI/CD pipelines. CI/CD pipelines have numerous advantages, like faster development cycles with fewer bugs. In the template we have already included a (simple) CI pipeline to run the tests in the project and lint the code.

Since make works out of the box, we don’t need additional dependencies in our CI pipeline. We often see CI/CD pipelines requiring specific shell scripting to get running, creating a lot of surface area for bugs. After all, you cannot properly run GitHub Actions locally (at the time of writing). As an added benefit, this avoids vendor lock-in with GitHub Actions, as the workflow file does not contain a lot of logic:

jobs:
  test:
    name: Run the test suite
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Run the tests
        run: make test

  checks:
    name: Check code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: Checks
        run: make check

Link to workflow

Getting Started

To get started, make sure you have the following requirements installed:

  • Docker
  • Make (available on most unix systems)

Then clone the project (replace my-project with a name of your choosing):

$ git clone https://github.com/BiteStreams/fastapi-template my-project
$ cd my-project

Optional: to get code completion in your editor, for example, you can install a local copy of the Python environment with poetry install (assuming the correct Python version and Poetry are available).

To get an overview of the available commands:

$ make help
up:
 Run the application
done: check test
 Prepare for a commit
test: utest itest
 Run unit and integration tests
utest: cleantest
 Run unit tests
itest: cleantest
 Run integration tests
check: cleantest
 Check the code base
cleantest:
 Clean up test containers
migrations:
 Generate a migration using alembic
migrate:
 Run migrations upgrade using alembic
downgrade:
 Run migrations downgrade using alembic
help:
 Display this help message

Start up the API:

$ make up

Then update the schema:

$ make migrate

That's it! The app is now running at localhost:5000/docs, where you should find the interactive API documentation.

Code changes are automatically detected using a Docker volume mount.
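
Under the hood this is typically a bind mount combined with the server's reload flag; a sketch of the relevant compose fragment (the names are illustrative):

services:
  app:
    build: .
    # --reload restarts the server when mounted files change
    command: uvicorn app.main:app --host 0.0.0.0 --port 5000 --reload
    volumes:
      - .:/app   # host source is visible inside the container
    ports:
      - "5000:5000"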

Testing FastAPI

The tests are written using Pytest and are containerized; the Docker setup for them can be found in the .ci/ folder. You can run all of them using:

$ make test


This runs both the integration & unit tests. To run them separately, use make itest for the integration tests and make utest for the unit tests. You will notice that make itest is significantly slower, as it also starts up a PostgreSQL database container for the tests.
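
As an illustration of what a unit test can look like with Pytest and FastAPI's test client (the endpoint under test is the hypothetical example from the introduction, not code from the template):

from fastapi.testclient import TestClient

from app.main import app  # hypothetical module path

client = TestClient(app)

def test_create_user():
    # A valid body passes Pydantic validation and returns 200.
    response = client.post("/users", json={"name": "Alice", "age": 30})
    assert response.status_code == 200

def test_create_user_rejects_invalid_body():
    # Pydantic turns a bad payload into a 422 before our code runs.
    response = client.post("/users", json={"name": "", "age": -1})
    assert response.status_code == 422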

Relational Database

Since most projects require a relational database, we have included PostgreSQL out of the box. It starts automatically in a separate container when you start the application. Database interactions are handled by SQLAlchemy, an industry standard, and there are Make targets for creating migrations using Alembic. I have a blog post with 10 Tips for adding SQLAlchemy to FastAPI, which describes some implementation choices.
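
For reference, a minimal SQLAlchemy model and session setup might look like this (a generic sketch with an illustrative connection URL, not the template's exact code):

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

# In practice, the connection URL comes from the environment.
engine = create_engine("postgresql://app:app@db:5432/app")
SessionLocal = sessionmaker(bind=engine)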

To migrate your database to the latest schema run:

$ make migrate

To create a new migration after you have made schema changes, run the following, adding a message to clarify what was changed:

$ make migrations m="Migration message"
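
The generated migration is a plain Python file in the Alembic versions directory; a typical (hypothetical) one looks roughly like this:

"""Migration message

Revision ID: abc123
"""
from alembic import op
import sqlalchemy as sa

revision = "abc123"
down_revision = None

def upgrade():
    # Applied by `make migrate`.
    op.add_column("users", sa.Column("email", sa.String(), nullable=True))

def downgrade():
    # Applied by `make downgrade`.
    op.drop_column("users", "email")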

Linting the code

Linting tools are great for making sure that everyone writes consistently formatted code. They allow a team (when coding and doing code reviews) to focus on what the code is doing, instead of on whether there is an appropriate line break at the end of the file.

To check the code for linting issues:

$ make check

This checks the code using pre-commit in a container. You can configure all these tools in the pyproject.toml file in the root dir, and you can also install them locally for better performance.
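
For example, a formatting configuration in pyproject.toml might look like this (illustrative values):

[tool.black]
line-length = 100

[tool.isort]
profile = "black"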

FastAPI tests in GitHub Actions

Any serious project nowadays has CI (& CD) pipelines, and so does this one. The pipeline is implemented with simple GitHub Actions, as it reuses the tools we introduced earlier. The CI pipeline runs whenever you push to the main branch, which you can easily change in the .github/workflows/code-integration.yml file. It runs make test and make check, just as you would locally, to ensure that the code is tested and linted.

Conclusion

Some of these integrations, though simple, can be tedious to get right. The template has helped teams I worked in get up and running in several projects and onboard people quickly. Using a Makefile means the setup is not even FastAPI (or Python) specific: I have used the same setup for a Golang project. Hopefully this template can help others starting their FastAPI projects as well.

This template was originally created for a presentation on unit-testing for SharkTech talks at Start Up Village Amsterdam.