Inside our webapp developer environment
This document is appropriate for applications which do not have a `compose.sh` file in the root of the repository. The presence of `compose.sh` indicates that an application is using our older development environment; see the original guide if that file is present.
This document provides a "behind the scenes" guide to how our webapp developer environment works and why it is designed the way it is. Nothing in this guide is required reading if you just want to use the developer environment, but it may be interesting to those wanting to know more about the "how" and the "why".
By contrast, the following documentation may be more useful if you have a specific goal or want to learn how to get started:
- The reference guide contains cut-and-paste command snippets to quickly accomplish common tasks.
- Our how-to section contains more task-focussed guides covering how to perform specific tasks within the environment.
The guiding philosophy for the webapp developer environment can be summarised as follows:
- Provide a safe "paved path" to success:
- Defaults should be opinionated, reasonable and safe, avoiding footguns wherever possible.
- Projects should include their dependencies in a reproducible way. Where appropriate, exact version specifiers and package hashes should be used to ensure reproducible environments.
- CI should be seen as useful and providing a helpful safety net, not as some beast to be appeased.
- Be efficient:
- Moving from `git clone` to getting the application running should be quick and require minimal project-specific software installation.
- Our developers context switch a lot; it should be easy to move between products without having to install different versions of system software.
- The edit-test-debug cycle is fundamental to a developer's workflow. Make it fast.
- For developers with machines which use a CPU architecture which does not match production, for example Apple Silicon, we don't want to pay unnecessary performance penalties.
- Expect a diverse set of developers:
- Developers have a variety of machine platforms and Operating Systems. Accept this and use containerisation to harmonise the developer environment.
- We want the freedom for one project to upgrade their dependencies without affecting other projects or requiring global changes to a developer's system.
- Developers should have confidence that what works on their machines will work in production.
Tools and technologies
This section discusses the tools we use as part of our developer environment and a rationale for using them.
Docker and docker compose
Our guiding philosophy forms the core of the reasoning behind the heavy use of containerisation in our development environment. While it is certainly possible for code to be tested running natively on the developer's machine, when it comes to running the application itself there are a number of "sidecar" services which need to be running. These include the database and, for API projects, the API Gateway emulator proxy.
The docker and docker compose tools allow each of those sidecar services to be specified, started automatically as needed, and connected to the web application. By using the docker healthcheck feature we can ensure that services are started in the correct order and wait for their dependencies to be ready.
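As a sketch (the service names and images here are illustrative, not necessarily those in our boilerplate), a compose healthcheck and dependency might look like:

```yaml
services:
  db:
    image: postgres:15
    healthcheck:
      # Mark the database healthy once it accepts connections.
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 10
  webapp:
    build: .
    depends_on:
      # Do not start the webapp until the db healthcheck passes.
      db:
        condition: service_healthy
```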
Programming languages usually come with some method of installing packages "locally" to a project; in the node ecosystem this is the `node_modules` directory. In the Python ecosystem a directory containing downloaded packages and links to the Python interpreter intended to run them is called a "virtualenv". Historically it has been the responsibility of the developer to manually manage virtualenvs.
The "poetry" tool is responsible for the following:
- Maintaining a list of packages which are needed to run the application along with version specifiers.
- Generating a list of the exact versions of packages which should be installed and cryptographic hashes of their content.
- Creating a virtualenv and installing packages into it.
The `poetry install` command will create a virtualenv, fetch the required packages and install them into the created virtualenv. This is the default behaviour and ensures that packages required for one project will not conflict with those required for another. The virtualenv creation is transparent to the developer.
Commands can be run inside the virtualenv via `poetry run`. For example, the `pytest` test runner can be run via `poetry run pytest`.
Poetry maintains the `poetry.lock` file which specifies the exact versions of packages used along with a cryptographic hash of the expected package file. By pinning versions and recording expected package hashes we can have confidence that the exact execution environment can be re-created from one developer's machine to another. Again, this is the default behaviour and is intended to make it easy to "do the right thing" with respect to dependency version pinning.
Poetry providing opinionated and reasonable defaults surrounding developer environments and dependency management aligns with our guiding philosophy.
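As an illustration (the package names and versions are examples, not our actual dependency set), the loose, human-edited constraints live in `pyproject.toml` while the generated `poetry.lock` records the resolved versions and hashes:

```toml
# pyproject.toml: human-edited version constraints.
[tool.poetry.dependencies]
python = "^3.11"
django = "^4.2"

# poetry.lock (generated, never hand-edited) then records entries like:
# [[package]]
# name = "django"
# version = "4.2.7"
# files = [{file = "Django-4.2.7-py3-none-any.whl", hash = "sha256:..."}]
```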
In the node ecosystem, the `package.json` file includes the concept of "scripts": commands which are given short aliases. So, for example, running `npm test` will cause the test suite to be run, with the exact command required to do this specified in `package.json`. We use a similar tool called poe which allows these tasks to be specified in `pyproject.toml`. For example, the `up` task is a short alias for `docker compose --profile development up`. The use of poe tasks is intended as a convenience for developers, meaning that they need only recall the short `poe up` command and do not need to recall the longer underlying command.
A list of the standard poe tasks shipped with our standard webapp developer environment can be found in the reference guide.
Providing common tasks behind a short alias aligns with our guiding philosophy in that we're providing both opinionated defaults about how to run and manage the application along with a documented "paved path" for various tasks.
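For example, task definitions in `pyproject.toml` might look like the following (the task bodies here are illustrative; see the reference guide for the tasks we actually ship):

```toml
[tool.poe.tasks]
# Stand up the development environment.
up = "docker compose --profile development up"
# Run the linters and auto-formatters.
fix = "pre-commit run --all-files"
# Run the test suite.
test = "tox"
```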
We ship a number of pre-commit checks in the standard environment which run various linting and auto-formatting tools over the source. These same checks are run in CI jobs.
This aligns with our philosophy by providing safety-nets for developers before they have even pushed their changes. By having tools which auto-format the code, we aid the developer in being efficient by not needing to manually edit their code to appease the linter.
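A `.pre-commit-config.yaml` along these lines would wire up such hooks (the repositories are real pre-commit hooks, but the revisions shown are examples rather than the versions we pin):

```yaml
repos:
  - repo: https://github.com/psf/black
    rev: 24.4.2
    hooks:
      - id: black        # auto-format Python source
  - repo: https://github.com/pycqa/isort
    rev: 5.13.2
    hooks:
      - id: isort        # keep imports sorted
```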
Tox and pytest
The pytest framework is used to discover and run tests. Pytest is a mature and well-supported testing framework and has first class support in IDEs such as Visual Studio Code. Pytest's fixture support provides a flexible framework for separating common functionality for tests out so that the tests themselves can remain minimal.
Pytest also has support for plugins which provide code coverage information used by GitLab CI to provide feedback on Merge Requests, etc.
The tox utility provides features which overlap somewhat with poetry, but it also offers the ability to define standard "testing environments" which contain a defined set of dependencies and the commands required to run test suites. For our standard developer environment we provide only one environment, which installs dependencies via `poetry` and runs tests via `pytest`.
Our CI jobs just run `tox` inside the built container, and so applications with more complex testing requirements can add additional test environments to `tox.ini`. For example, an API application may ship with an OpenAPI schema specification, and a test environment can be added to `tox.ini` which installs a schema verification tool and runs it.
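A minimal sketch of such a `tox.ini` (the extra schema environment and tool are hypothetical examples, not part of the standard boilerplate):

```ini
[testenv]
# Standard environment: install via poetry, run pytest.
allowlist_externals = poetry
commands_pre = poetry install
commands = poetry run pytest {posargs}

[testenv:schema]
# Extra environment an API project might add.
deps = openapi-spec-validator
commands = openapi-spec-validator openapi.yaml
```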
Where functionality lives
This section covers some key functionality and where it is configured.
Linting and code formatting
The `.pre-commit-config.yaml` file defines a number of pre-commit checks which are run locally on the developer's machine on commits, or at any time via `poe fix`. The same checks are run in CI. For Python projects these checks include tools such as `isort` which automatically format source files to keep consistency between projects.
The `pyproject.toml` file contains configuration for:
- black - an automatic code formatter,
- isort - a tool which keeps `import` statements sorted alphabetically, and
- mypy - a static type checking tool for Python.
We also use flake8, a code linter for Python, which is configured in its own file. At the time of writing, the flake8 developers have not implemented support for configuration via `pyproject.toml`.
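Illustrative `pyproject.toml` sections for these tools (the values are examples, not our mandated settings); flake8's configuration would sit in its own file such as `.flake8` or `setup.cfg`:

```toml
[tool.black]
line-length = 99

[tool.isort]
# Use settings compatible with black's formatting.
profile = "black"

[tool.mypy]
ignore_missing_imports = true
```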
Some of our applications need some runtime secrets in order to be run locally on a developer's
machine. Our convention is to place these in
secrets.env and have a template file,
secrets.env.in, which specifies which secrets are needed and where to find them.
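A hypothetical `secrets.env.in` template might look like this (the variable names and sources are invented for illustration):

```sh
# Copy this file to secrets.env and fill in the real values.
# secrets.env itself must never be committed.

# From the team password vault:
API_TOKEN=

# From the staging database console:
DATABASE_PASSWORD=
```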
The `pyproject.toml` file contains configuration for `pytest` and the pytest plugin which generates code coverage information. If you need to exclude a specific file or directory from code coverage reports, for example, you add it to this file.
The pytest configuration within `pyproject.toml` also contains configuration for filtering warnings if you need to do so.
Environment variables which must be set in order to run tests are specified in `pyproject.toml` via the pytest-env plugin. Note that this is different to all other cases, where environment variables are set in `docker-compose.yml`.
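Putting those pieces together, the pytest-related sections of `pyproject.toml` might resemble the following (the names and values are illustrative):

```toml
[tool.pytest.ini_options]
# Silence a noisy warning from a third-party package.
filterwarnings = [
    "ignore::DeprecationWarning:somepackage.*",
]
# pytest-env: fake secret values for tests.
env = [
    "API_TOKEN=fake-token-for-tests",
]

[tool.coverage.run]
# Exclude generated code from coverage reports.
omit = ["*/migrations/*"]
```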
The `tox.ini` file contains information on test environments and test jobs. For our standard environment we have one single environment which installs dependencies via `poetry` and runs `pytest`. As applications grow they may have other tools which need to be run as part of a testing process. The GitLab CI jobs run `tox` which will, in turn, run all test environments. As such, any specialised test jobs can be added to `tox.ini` as required and the CI will automatically run them.
The `Dockerfile` contains the definition for the production container along with the container used for local development. Because we use a straightforward `docker build` in GitLab CI, the production container must be the final image definition in the `Dockerfile`.
For most applications, this packaging is fairly light. For the production container:
- We build any UI in a dedicated "frontend builder" image.
- We install Python dependencies in a dedicated "installed dependencies" image. We use `poetry export` to generate a list of the exact package versions along with a cryptographic hash of their contents so that we can verify that we're installing what we think we're installing.
- The "production container" image is based on the "installed dependencies" image and has the built frontend files and application code copied into it.
The "development container" image is similar but additionally has the development time dependencies installed into it. See the developer environment reference guide for more information on the difference between "development time" and "run time" dependencies.
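The staged build described above might be sketched as follows (the base images, paths and run command are illustrative, not the exact contents of our boilerplate):

```dockerfile
# Stage 1: build the frontend.
FROM node:20 AS frontend-builder
WORKDIR /frontend
COPY frontend/ .
RUN npm ci && npm run build

# Stage 2: install pinned Python dependencies.
# requirements.txt is generated via "poetry export" and contains
# exact versions plus hashes, verified at install time.
FROM python:3.11-slim AS installed-dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip install --require-hashes -r requirements.txt

# Final stage: the production container. It must come last because
# CI performs a plain "docker build" of the whole file.
FROM installed-dependencies AS production
COPY --from=frontend-builder /frontend/build ./static/
COPY . .
CMD ["gunicorn", "webapp.wsgi"]
```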
The `.gitlab-ci.yml` file in the developer environment is purposefully kept small so that project-specific changes are made clear. We instead include a standard CI template.
Currently the standard CI template lives in the webapp boilerplate repo and so ends up being copy-pasted into every deployment. We will probably move to keeping this template in a central repo in the medium term.
The standard template includes the common pipeline which includes things like pre-commit checks, running Python tests and building and pushing container images. In addition we configure a dedicated testing job which runs the test suite inside the production container. This is with an eye to CI being "helpful" and catching issues which may arise in production but which may not necessarily manifest on your own machine.
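The project-level `.gitlab-ci.yml` can therefore stay as small as something like the following (the template path and variable are placeholders, not our actual file layout):

```yaml
include:
  # The standard CI template copied in from the boilerplate.
  - local: ".gitlab/common-pipeline.yml"

variables:
  # Project-specific settings only.
  DJANGO_SETTINGS_MODULE: "webapp.settings"
```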
We use docker compose to provide a convenient way to stand up a development instance. The `docker-compose.yml` file in the repository root contains configurations for the services required.
We use docker compose profiles to group services. For example, all the services which need to be started to stand up a development environment are grouped into the `development` profile. Services, such as running Django management commands, which are intended to be run as one-offs, are put into the `utilities` profile. This means that the `up` task need only ask docker compose to start all services in the `development` profile.
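In `docker-compose.yml` this grouping is expressed with the `profiles` key, roughly like so (service definitions elided):

```yaml
services:
  webapp:
    # Started by "docker compose --profile development up".
    profiles: ["development"]
    # ...
  manage:
    # One-off service, only run explicitly.
    profiles: ["utilities"]
    # ...
```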
The following services are started by docker compose for the `development` profile:
- The `db` service starts a PostgreSQL instance. It configures a healthcheck so that services which depend on `db` can wait for the database to be ready before starting.
- The `gateway-emulator` service starts an emulator for the API Gateway proxy. This is the service which listens on `localhost:8000` and forwards requests through to the webapp itself.
- For projects with a UI, the `frontend_watch` service starts a nodejs container which runs `npm run watch` inside the frontend directory, which is mounted from the repository. The frontend is re-built each time there is a change to the frontend source, and the built frontend is written to a docker volume called `frontend-build`.
- The `webapp` service starts the development webapp itself by launching the development container image defined in the `Dockerfile`. The service is marked as depending on the database and API Gateway emulator so that it is not started before those services are ready. The service has the local repository mounted so that local changes trigger Django's hot-reload support, and it has environment variables set appropriately so that the webapp uses the appropriate database. For projects with UIs, the `frontend-build` volume is mounted read-only so that the application can serve the compiled frontend and will see changes as it is rebuilt.
In addition to the `development` profile, the `docker-compose.yml` file defines a `webapp-production` service which is similar to the `webapp` service except that it does not mount any volumes. As such it only runs the production image as built and does not support hot-reload.
Aside from these services, `docker-compose.yml` defines some "utility" services which mount local files and set environment variables suitably for running Django management commands and running tests. These services are used by poe tasks such as `manage` to run management commands.
We use pytest as a test harness and test discovery platform. Pytest
fixtures which are used across all tests can be defined in the
conftest.py file. Our standard
boilerplate uses the
conftest.py file to configure a PostgreSQL instance for running tests.
For our webapps there are a few standard pytest plugins we use:
- pytest-env is used to allow us to override or provide default values for environment variables when running tests. These are configured in `pyproject.toml` and we generally use this to provide "fake" values for secrets, avoiding the use of real secret values when running tests.
- pytest-django is used to provide the appropriate wiring to allow Django apps to be available when running tests and for the database to be appropriately configured. In particular, this allows us to ensure that tests which should not access the database do not do so.
- pytest-docker-tools is used to start a dedicated PostgreSQL instance used when running tests. In some cases you may want to use an external database instance to run tests. This is controlled by the `TEST_USE_EXTERNAL_DATABASE` environment variable and is enacted in the `conftest.py` test configuration file.
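As a sketch of how that toggle could be read in `conftest.py` (the helper name and accepted values are assumptions; the boilerplate's actual logic may differ):

```python
import os

def use_external_database() -> bool:
    """Return True when tests should target an externally managed
    PostgreSQL instance rather than one started by pytest-docker-tools.

    Hypothetical helper: treats unset/empty/"0"/"false"/"no" as off.
    """
    value = os.environ.get("TEST_USE_EXTERNAL_DATABASE", "")
    return value.lower() not in ("", "0", "false", "no")
```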
IDEs may run pytest directly and so we both support this and enforce that it continues to work in a CI job. This is useful for developers who prefer developing code and manually triggering a specific test in their IDEs.
We use the tox test runner as a standard entrypoint for running tests. The `tox.ini` file defines how tests should be run and configures code coverage and test report generation. We use tox because it allows us to use a single CI configuration for testing without mandating a particular test harness. It also allows additional testing tools to be used without adding additional CI configuration.
CI aims to be helpful and splits the different ways of running tests into separate jobs. So, for example, you may see that your tests succeed locally but not when built into the production container. This provides a safety-net against, for example, not including a required system-level dependency in the production container.
This document has outlined a little of the "how" and the "why" of our boilerplate template. Since we expect the boilerplate to improve over time, we have intentionally remained somewhat generic in our discussion.