This project provides Docker images to periodically back up Elasticsearch indices to AWS S3, and to restore from those backups as needed.
```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.12.2
    environment:
      - node.name=elasticsearch
      - cluster.name=elastic-base-cluster
      - discovery.type=single-node
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - cluster.routing.allocation.disk.threshold_enabled=false
      - xpack.security.enabled=false
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - "9200:9200"
      - "9300:9300"

  backup:
    image: langtechbsc/multi-elasticseach-backup-s3:latest
    environment:
      SCHEDULE: '@weekly'     # optional
      BACKUP_KEEP_DAYS: 7     # optional
      PASSPHRASE: passphrase  # optional
      S3_REGION: region
      S3_ACCESS_KEY_ID: key
      S3_SECRET_ACCESS_KEY: secret
      S3_BUCKET: my-bucket
      S3_PREFIX: backup
      ELASTICSEARCH_HOST: elasticsearch:9200
      # ELASTICSEARCH_USER: user
      # ELASTICSEARCH_PASSWORD: password
```
- The `SCHEDULE` variable determines backup frequency. See the go-cron schedules documentation. Omit it to run the backup immediately and then exit.
- If `PASSPHRASE` is provided, the backup will be encrypted using GPG.
- Run `docker exec <container name> sh backup.sh` to trigger an ad-hoc backup.
- If `BACKUP_KEEP_DAYS` is set, backups older than that many days will be deleted from S3.
- Set `S3_ENDPOINT` if you're using a non-AWS S3-compatible storage provider.
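As a minimal sketch of the `PASSPHRASE` behaviour: the backup is encrypted with GPG symmetric encryption, so a downloaded backup can be decrypted offline with the same passphrase. The file names below are illustrative, not the image's actual naming scheme.

```shell
PASSPHRASE=passphrase
tmpdir=$(mktemp -d)
echo 'example index data' > "$tmpdir/dump.json"

# roughly what the backup container does when PASSPHRASE is set
gpg --batch --yes --pinentry-mode loopback --symmetric \
    --passphrase "$PASSPHRASE" -o "$tmpdir/dump.json.gpg" "$tmpdir/dump.json"

# decrypting a backup you fetched from S3 (e.g. with `aws s3 cp`)
gpg --batch --yes --pinentry-mode loopback --decrypt \
    --passphrase "$PASSPHRASE" -o "$tmpdir/restored.json" "$tmpdir/dump.json.gpg"
cat "$tmpdir/restored.json"
```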
To restore from the most recent backup:

```sh
docker exec <container name> sh restore.sh
```
> **Note:** If your bucket contains more than 1,000 objects, the most recent backup may not be the one restored, because only a single `aws s3 ls` call is used.
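The limitation above comes from S3 listing at most 1,000 keys per call, so the newest object can fall outside the single page the restore script sorts. A sketch of picking the newest key from a complete listing (e.g. obtained with `aws s3api list-objects-v2`, which can paginate); the key names are illustrative, not the image's real naming scheme:

```shell
# Backup keys start with a sortable timestamp, so a lexicographic sort
# puts the newest one last.
keys='backup/2024-01-01T000000.dump.gpg
backup/2024-03-05T120000.dump.gpg
backup/2024-02-10T080000.dump.gpg'
latest=$(printf '%s\n' "$keys" | sort | tail -n 1)
echo "$latest"
```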
To restore from a specific backup, pass its timestamp:

```sh
docker exec <container name> sh restore.sh <timestamp>
```
To build the image yourself:

```sh
DOCKER_BUILDKIT=1 docker build --build-arg ALPINE_VERSION=3.14 .
```
```sh
cp template.env .env
# fill out your secrets/params in .env
docker compose up -d
```
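Once the stack is up, it can be worth confirming that Elasticsearch answers before relying on scheduled backups; against a live stack you would run `curl -s http://localhost:9200/_cluster/health`. The snippet below checks a captured sample response the same way (the response body is illustrative):

```shell
# Extract the "status" field from a cluster-health JSON response.
health='{"cluster_name":"elastic-base-cluster","status":"green"}'
status=$(printf '%s' "$health" | sed -n 's/.*"status":"\([a-z]*\)".*/\1/p')
echo "$status"
```

A status of `green` or `yellow` means the cluster is serving requests.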
This project is a fork and re-structuring of @schickling's postgres-backup-s3.
- Ability to back up Elasticsearch indices:
  - implemented with elasticsearch-dump;
  - uses multielasticdump to back up all indices.