Web Application

Prerequisites

  • Docker
  • Python 3.11
  • Poetry
  • Pip
  • Azure Functions Core Tools v4
  • Azure CLI

Getting Started

The most direct route to running the application locally is using the Docker quickstart.

The repository contains a docker-compose file for development, so once you have set up the configuration, run docker-compose up -d to start the application stack. This starts the database and the Azurite emulator, and builds and runs the web app container.
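For example, a minimal sketch of bringing the stack up and checking it is healthy; the service name web is an assumption, so check docker-compose.yml for the actual service names:

    # Start the database, Azurite, and the web app in the background
    docker-compose up -d

    # Confirm the containers are running and follow the web app logs
    docker-compose ps
    docker-compose logs -f web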

Docker mounts the api directory into the web app container, so any changes to the Python code are reflected in the running application.

You will need to run some commands from inside the web app container. To access it, run: docker exec -it carrot-mapper-web-1 bash
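For example, a short sketch of opening a shell in the running container; the location of the api directory inside the container is an assumption, so adjust the path to match your setup:

    # Open an interactive shell inside the web app container
    docker exec -it carrot-mapper-web-1 bash

    # Inside the container, change into the api directory to run the Django commands below
    cd api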

Configuration

The application is configured through environment variables and a local.settings.json file.

Rename the existing sample-env.txt to .env, and sample-local.settings.json to local.settings.json to use the default values with the Docker setup.
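For example, a minimal sketch assuming the sample files sit in the repository root (copying rather than renaming keeps the originals in place):

    # Create the local configuration files from the provided samples
    cp sample-env.txt .env
    cp sample-local.settings.json local.settings.json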

Database Setup

The application stack uses a PostgreSQL database, with code-first migrations for managing the database schema.

OMOP Tables

You need a pre-seeded OMOP CDM database with the schema omop. See the OMOP quickstart for how to get this running.
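As a quick sanity check once the database is running, you can confirm the omop schema exists. This is a hedged sketch: the database container name and credentials are assumptions, so substitute the values from your .env:

    # List the schemas in the database and check that "omop" is present
    docker exec -it carrot-mapper-db-1 psql -U <DBUSER> -d <DBNAME> -c '\dn'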

Web App Tables

When setting up a new environment, or when upgrading to a newer version of the codebase that includes schema changes, you need to run the migrations against your web app database.

From the api directory inside the web app container, run: python manage.py migrate
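For example, a minimal sketch of applying the migrations through the container entrypoint in one command; the api path inside the container is an assumption:

    # Run the Django migrations inside the web app container
    docker exec -it carrot-mapper-web-1 bash -c "cd api && python manage.py migrate"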

Seed Data

You need to seed the web app database with the OMOP table and field names. From the api directory inside the web app container, run: python manage.py loaddata mapping

To add a new admin user, run: python manage.py createsuperuser
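For example, a sketch of seeding the fixtures and creating an admin user through the container entrypoint; as above, the api path inside the container is an assumption:

    # Seed the OMOP table and field names
    docker exec -it carrot-mapper-web-1 bash -c "cd api && python manage.py loaddata mapping"

    # Create an admin user (interactive prompts follow)
    docker exec -it carrot-mapper-web-1 bash -c "cd api && python manage.py createsuperuser"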

Azure Functions

Whilst the rest of the stack runs in containers, the worker functions run directly in your Python environment.

To create the storage containers and queues, use the Azure CLI (a scripted version follows the list):

  • az storage container create -n scan-reports --connection-string <CONNECTIONSTRING>
  • az storage container create -n data-dictionaries --connection-string <CONNECTIONSTRING>
  • az storage queue create -n nlpqueue-local --connection-string <CONNECTIONSTRING>
  • az storage queue create -n uploadreports-local --connection-string <CONNECTIONSTRING>
  • az storage queue create -n rules-local --connection-string <CONNECTIONSTRING>
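For example, the same containers and queues can be created in a short loop; the placeholder stands for your Azurite connection string from .env:

    CONNECTIONSTRING="<CONNECTIONSTRING>"

    # Create the blob containers
    for container in scan-reports data-dictionaries; do
      az storage container create -n "$container" --connection-string "$CONNECTIONSTRING"
    done

    # Create the queues
    for queue in nlpqueue-local uploadreports-local rules-local; do
      az storage queue create -n "$queue" --connection-string "$CONNECTIONSTRING"
    done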

To run the functions, from the project root:

  1. In the Django admin at http://localhost:8000/admin, generate a new auth token for the admin user.
  2. Add the token to local.settings.json as AZ_FUNCTION_KEY.
  3. Install the dependencies in app/workers: poetry install
  4. Run the functions: poetry run func start (see the sketch after this list).
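A minimal sketch of steps 3 and 4, assuming the worker functions live in app/workers as above:

    # From the project root, install dependencies and start the worker functions
    cd app/workers
    poetry install

    # Make sure AZ_FUNCTION_KEY is set in local.settings.json (step 2) before starting
    poetry run func start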

You should now be set up to run the user quickstart.

Python Environment

You can also run the web app directly in a Python environment instead of the Docker image.

Prerequisites:

  • A separate Poetry environment for the web app.
  • graphviz package installed.
  • Node v12.18.3

  • Change the environment configuration to point to the running Docker containers, for example localhost instead of azurite.

  • Inside the app/react-client-app directory, install the npm dependencies: npm i
  • Change the snowpack.config.js to use a relative file path:
    {
      buildOptions: {
        out: '../api/static/javascript/react',
      },
    }
    
  • Build the React app: npm run build.
  • Inside the app/api directory, install the Python dependencies: poetry install.
  • Collect the Django static files: poetry run python manage.py collectstatic.
  • Run the app: poetry run python manage.py runserver (a combined sketch of these steps follows this list).
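Taken together, a hedged sketch of the full sequence; paths assume the repository layout described above, and the configuration changes should be made first:

    # Build the React front end
    cd app/react-client-app
    npm i
    npm run build

    # Install dependencies and run the Django app
    cd ../api
    poetry install
    poetry run python manage.py collectstatic
    poetry run python manage.py runserver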

If you're using VSCode, you can use Workspaces to manage your Python virtual environments, and the debugger to run the web app and functions.