Commit

Merge pull request #15 from Klantinteractie-Servicesysteem/edit-readme

Edit readme

felixcicatt authored Jan 18, 2024
2 parents 624a4e3 + d911e1a commit d1d009a
Showing 2 changed files with 14 additions and 10 deletions.
19 changes: 11 additions & 8 deletions README.md
@@ -1,17 +1,26 @@
# KISS-Elastic-Sync

## Background
## Introduction
KISS offers the possibility to search for information within specific sources. This search functionality uses Elasticsearch. The KISS-Elastic-Sync tool is used to create the necessary engines in an Elasticsearch installation.

Two types of sources are indexed in Elasticsearch to allow them to be easily searched from KISS:
- Websites (by running this tool to set up a `crawler` in Enterprise Search)
- Structured sources (by scheduling this tool to synchronize data from the source to an `index` in Elasticsearch)

## Run locally
1. Make a copy of `.env.local.example`, rename it to `.env.local` and fill in the required secrets (see the sketch after this list).
2. Set up a port forward for Enterprise Search, e.g.: `kubectl port-forward service/kiss-ent-http 3002`
3. Set up a port forward for Elasticsearch, e.g.: `kubectl port-forward service/kiss-es-http 9200`
4. Build the tool using Docker Compose: `docker compose build`
5. Run the tool using Docker Compose: `docker compose --env-file ./.env.local run kiss.elastic.sync [ARGS...]`
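For reference, below is a minimal sketch of what `.env.local` could look like, assuming the port forwards from steps 2 and 3. The variable names for the Elasticsearch and Enterprise Search endpoints are placeholders; the authoritative names are in `.env.local.example`. Only the VAC variables are documented in the environment variables table further down.

```sh
# Hypothetical sketch of .env.local — the real variable names come from .env.local.example.
# Placeholder names, assuming the local port forwards from steps 2 and 3:
ELASTIC_BASE_URL=https://localhost:9200
ENTERPRISE_SEARCH_BASE_URL=http://localhost:3002

# Documented in the environment variables table further down:
VAC_OBJECTTYPES_BASE_URL=https://objecttypes.example.com/api/v2
VAC_OBJECTTYPES_TOKEN=<your-objecttypes-token>
```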

## When you first set up a source
This tool does the following automatically (a rough sketch of the underlying API calls follows this list):
1. Create an Enterprise Search `engine` for the source. For websites, a `crawler` is created and run. For structured sources, an `index` is created and linked to the `engine`.
1. Create a `meta engine`. This is used to aggregate multiple sources. The `engine` from step 1 is linked to this `meta engine`.
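For context, this is roughly the kind of Enterprise Search (App Search) API call the tool automates. The tool performs these steps itself; the engine names below are purely hypothetical examples and assume the local Enterprise Search port forward.

```sh
# Sketch only — kiss.elastic.sync automates this; engine names are hypothetical.
# Create an engine for a source (App Search engines API, via the Enterprise Search port forward).
curl -X POST "http://localhost:3002/api/as/v1/engines" \
  -H "Authorization: Bearer $APP_SEARCH_PRIVATE_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "example-source"}'

# Create a meta engine that aggregates source engines, linking the engine from step 1 to it.
curl -X POST "http://localhost:3002/api/as/v1/engines" \
  -H "Authorization: Bearer $APP_SEARCH_PRIVATE_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "example-meta", "type": "meta", "source_engines": ["example-source"]}'
```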

## Relevance tuning
You can use `Relevance tuning` from Kibana on the `meta engine`.
You can use `Relevance tuning` from Kibana on the `meta engine`. See also the [KISS documentation (in Dutch)](https://kiss-klantinteractie-servicesysteem.readthedocs.io/en/latest/CONFIGURATIE/#configuratie-van-elasticsearch-voor-kiss).

## Supported structured sources
- SDG Producten
@@ -53,9 +62,3 @@ Examples of how to schedule a cron job in Kubernetes with these arguments [can b
| VAC_OBJECTTYPES_BASE_URL | The base url for the Object Types API to retrieve the VAC object type |
| VAC_OBJECTTYPES_TOKEN | The token to connect to the Object Types API to retrieve the VAC object type |

## Run locally
1. Make a copy of .env.local.example, rename it .env.local and fill in the required secrets.
2. Set up a port forward for Enterprise Search, e.g.: `kubectl port-forward service/kiss-ent-http 3002`
3. Set up a port forward for Elasticsearch, e.g.: `kubectl port-forward service/kiss-es-http 9200`
4. Build the tool using Docker Compose: `docker compose build`
5. Run the tool using Docker Compose: `docker compose --env-file ./.env.local run kiss.elastic.sync [ARGS...]`
5 changes: 3 additions & 2 deletions deploy/README.md
@@ -1,5 +1,6 @@
# Deploying cronjobs

For each cronjob, a yaml file is included in this folder.
To install the cronjob, run the following command on the cluster:
The syncing of sources to Elasticsearch is done by cronjobs. A yaml file for each cronjob (one per supported source) can be found in this folder.

To install a cronjob, run the following command on the cluster that KISS and Elasticsearch are running on:
`kubectl apply -f .\cronjob-kennisartikelen.yaml`
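To verify the installation or trigger a one-off run for testing, the usual kubectl commands can be used. The cronjob name below is assumed from the yaml filename and may differ in your cluster.

```sh
# List the installed cronjobs and their schedules.
kubectl get cronjobs

# Trigger a one-off run (cronjob name assumed from the yaml filename; job name is arbitrary).
kubectl create job --from=cronjob/cronjob-kennisartikelen kennisartikelen-manual-sync

# Follow the logs of the resulting pod.
kubectl logs -l job-name=kennisartikelen-manual-sync -f
```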
