Zero-downtime, opinionated tool for reindexing Elasticsearch indices quickly and easily.
npm install -g es-reindex
To set up a small Elasticsearch cluster on your local machine for testing purposes, this is the recommended way:
docker pull elasticsearch
docker run -d -p 9200:9200 -p 9300:9300 elasticsearch
Open http://localhost:9200 to see the running Elasticsearch cluster.
To allow zero-downtime reindexing, we have to get the index setup right. We create one main alias plus indices with an ever-increasing timestamp suffix. Example:
For setting up a products index we create:
- an index called products_[current_timestamp], e.g. products_170. You can use es-reindex-create-index --index=products --addtimestamp=true --body=path/to/json.json to create such an index.
- an alias for products_170 called products. You can use es-reindex-add-to-alias --alias=products --index=products_170 --action=add to add an index to an alias and receive the timestamp. (See the sketch after this list for the equivalent API calls.)
Now we can talk to the index via products.
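For reference, here is roughly what the two commands above do at the Elasticsearch API level. This is a minimal sketch using the official elasticsearch Node client; the host, index name, and mapping body are placeholder assumptions, not part of es-reindex itself:

var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({ host: 'localhost:9200' });

// Hypothetical timestamped index name, e.g. products_170.
var indexName = 'products_' + Date.now();

// Create the timestamped index (the body would come from path/to/json.json).
client.indices.create({
  index: indexName,
  body: { /* settings and mappings from your JSON file */ }
}).then(function () {
  // Point the stable alias at the new index.
  return client.indices.putAlias({ index: indexName, name: 'products' });
}).then(function () {
  // Log when the alias changed; this timestamp matters for reindexing later.
  console.log('Alias products now points at ' + indexName + ' as of ' + Date.now());
});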
Sometime later we want to make changes to the index, which requires reindexing. The procedure is as follows:
- Create an index with the mutation, called products_[current_timestamp], e.g. products_180. You can use es-reindex-create-index --index=products --addtimestamp=true --body=path/to/json.json to create such a new index.
- Add products_180 to the products alias and log the timestamp (important!). You can use es-reindex-add-to-alias --alias=products --index=products_180 --action=add to add an index to an alias and receive the timestamp.
- Now run the reindexing from products_170 to products_180, covering everything from the earliest document in products_170 up to the timestamp at which products_180 was added to the alias. (Use scan-and-scroll with bulk inserts of _source, and parallelize aggressively; use this tool for it. A sketch follows this list.)
- Now remove the old index products_170 from the alias products, and you are done reindexing. You can use es-reindex-add-to-alias --alias=products --index=products_170 --action=remove to remove an index from an alias.
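To make the time window concrete, here is a minimal sketch of the reindex step: a scan-and-scroll over products_170, bounded by the logged alias timestamp, with each batch bulk-inserted into products_180. It assumes your documents carry an indexed timestamp field (hypothetically called updated_at here) and that aliasAddedAt is the timestamp you logged above; es-reindex performs this for you, in parallel:

var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({ host: 'localhost:9200' });

function reindexUpTo(aliasAddedAt) {
  // Scroll over the old index, bounded by the logged alias timestamp.
  return client.search({
    index: 'products_170',
    scroll: '1m',
    body: {
      query: { range: { updated_at: { lte: aliasAddedAt } } }
    }
  }).then(function handleBatch(response) {
    if (response.hits.hits.length === 0) return; // scroll exhausted, done
    var body = [];
    response.hits.hits.forEach(function (hit) {
      // Each bulk action line is followed by the document's _source.
      body.push({ index: { _index: 'products_180', _type: hit._type, _id: hit._id } });
      body.push(hit._source);
    });
    return client.bulk({ body: body }).then(function () {
      // Fetch the next batch and recurse until the scroll runs dry.
      return client.scroll({ scrollId: response._scroll_id, scroll: '1m' }).then(handleBatch);
    });
  });
}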
Simply run the following command to reindex your data:
$ es-reindex -f http://192.168.1.100:9200/old_index/old_type -t http://10.0.0.1:9200/new_index/new_type
You can omit {new_index} and {new_type} if the new index and type names are the same as the old ones:
$ es-reindex -f http://192.168.1.100:9200/old_index/old_type -t http://10.0.0.1:9200
Sometimes you may want to reindex the data with a custom indexer script (e.g. to reindex the data into multiple indices based on a date field). The custom indexer feature can help in this situation.
To use this feature, create your own indexer.js:
var moment = require('moment');

module.exports = {
  // Return a [bulk action, document] pair for each item to be indexed.
  index: function (item, options) {
    return [
      // Route each document into a monthly index, e.g. tweets_201401.
      { index: { _index: 'tweets_' + moment(item._source.date).format('YYYYMM'), _type: options.type || item._type, _id: item._id } },
      item._source
    ];
  }
};
Simply pass the script's path as the last argument and it will work:
$ es-reindex -f http://192.168.1.100:9200/old_index/old_type -t http://10.0.0.1:9200/ indexer.js
You can also add a custom query in indexer.js:
var moment = require('moment');

module.exports = {
  // Only documents matching this query will be reindexed.
  query: {
    query: {
      term: {
        user: 'Garbin'
      }
    }
  },
  index: function (item, options) {
    return [
      { index: { _index: 'tweets_' + moment(item._source.date).format('YYYYMM'), _type: options.type || item._type, _id: item._id } },
      item._source
    ];
  }
};
Then:
$ es-reindex -f http://192.168.1.100:9200/old_index/old_type -t http://10.0.0.1:9200/ indexer.js
Only the user Garbin's documents will be reindexed.
Reindexing a very big index can take a very long time. You may want to split it into smaller pieces and reindex them in parallel. You can do this with the "shard" feature:
var moment = require('moment');

module.exports = {
  // Split the source index into time-based shards that can be reindexed in parallel.
  sharded: {
    field: "created_at",
    start: "2014-01-01",
    end: "2014-12-31",
    interval: 'month' // 'day', 'week', or a number of days, e.g. 7 for 7 days
  },
  index: function (item, options) {
    return [
      { index: { _index: 'tweets_' + moment(item._source.date).format('YYYYMM'), _type: options.type || item._type, _id: item._id } },
      item._source
    ];
  }
};
This sharded config will split the big index into 12 shards based on the created_at field and reindex them in parallel.
Then:
$ es-reindex -f http://192.168.1.100:9200/old_index/old_type -t http://10.0.0.1:9200/ indexer.js
The indexer also supports promises, so you can request data from other parts of the database:
module.exports = {
  index: function (item, opts, client) {
    var indexData = {
      index: {
        _index: opts.index,
        _type: item._type,
        _id: item._id
      }
    };
    // With the client we can access other parts of our database.
    return client.mget({
      index: 'media',
      type: 'movies',
      body: {
        ids: item._source.favoriteMovieIDs
      }
    }).then(function (response) {
      // Denormalize the referenced movies into each document before indexing.
      item._source.faveMovies = response.docs.map(function (movie) {
        return {
          name: movie._source.name,
          id: movie._source.id
        };
      });
      return [indexData, item._source];
    });
  }
};
Then:
$ es-reindex -f http://192.168.1.100:9200/old_index/old_type -t http://10.0.0.1:9200/ -m true indexer.js
You will see the reindexing progress for every shard clearly.
Have fun!
Thanks to Elasticsearch Reindex for providing the base.
elasticsearch-reindex is licensed under the MIT License.