Commit e2554e1: Update developer documentation
steventux committed Jan 29, 2025 (1 parent: a38b6f6)
1 changed file: docs/developer-guides/Local_development_setup.md (28 additions, 71 deletions)

# Local development setup

This project uses Azure Functions to process data and send messages to the NHS Notify API Sandbox environment.

## Prerequisites

- Python 3.11 (preferred)
- `pipenv` for Python dependency management (see the [pipenv docs](https://pypi.org/project/pipenv/))
- [asdf tool version manager](https://asdf-vm.com/guide/getting-started.html) with the [asdf postgres plugin for PostgreSQL](https://github.com/smashedtoatoms/asdf-postgres), OR a manual PostgreSQL installation on your machine
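
Before starting setup, a quick sanity check of the toolchain can be sketched as follows (just a convenience, not part of the official setup; it only reports which of the prerequisite tools are on `PATH`):

```shell
# Report which prerequisite tools are installed (purely a convenience check)
for tool in python3 pipenv psql; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version 2>&1 | head -n 1)"
  else
    echo "$tool: not found"
  fi
done
```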

## Setup

1. Clone the repository
2. Install dependencies and initialize the virtual environment

```bash
pipenv install --dev
pipenv shell
```

3. Create a `.env.local` file in the root of the project using `.env.example` as a template (ask a team member for the values)
4. Create a PostgreSQL database and set the connection values in the `.env.local` file
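
For step 4, the `.env.local` entries might look like the fragment below. The database name and variable names here are purely illustrative; copy the real keys from `.env.example` and ask a team member for the correct values.

```
# Illustrative .env.local entries only — use the actual keys from .env.example
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_NAME=local_dev_db
DATABASE_USER=postgres
DATABASE_PASSWORD=postgres
```

The database itself can be created with `createdb local_dev_db` (or via `psql`) once PostgreSQL is running.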
5. Run the development start-up script to create the database tables and seed the database

```bash
./dev.sh
```

You should see the Azure function app start in the console.
6. Open `http://localhost:7071/api/healthcheck` in your browser to confirm the function app is running.
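
As an alternative to the browser, the health endpoint can be probed from a terminal. This assumes the route is `/api/healthcheck` and that the function app started by `./dev.sh` is still running:

```shell
# -f: fail on HTTP errors; -sS: quiet, but still show errors.
# Prints a fallback message if the function app is not reachable.
curl -fsS --max-time 5 http://localhost:7071/api/healthcheck || echo "function app not reachable"
```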

## Running the tests

To run the tests locally:

1. Install dependencies and initialize the virtual environment

```bash
pipenv install --dev
pipenv shell
```

2. Run the test script

```bash
./test.sh
```

