Corelle is a simple system for reconstructing the location of tectonic plates back in geologic time. The software is compatible with GPlates, but it is designed specifically to support interactive web visualizations. It is named for the venerable dinnerware owned by everyone's grandma.
- CHANGELOG
- Demo: pre-split coastline features
- Demo: externally-provided features
- Demo: regional rotation model (note: currently broken)
- Notebook: basic usage from Python
- Notebook: advanced usage – building a paleolatitude history curve
Corelle is designed for simplicity in two broad senses:
Corelle faithfully implements the subset of GPlates functionality that supports the rotation of existing plate models. While GPlates is a capable and complete system for building and rendering plate models, its sophistication (and that of its PyGPlates binding) comes at the cost of complexity that inhibits installation, usage, and integration with other systems.
Corelle is primarily designed for use by geoscientists outside of the tectonics domain, and makes it simple to achieve basic rotations. Its client/server design allows use in "satellite" applications without the overhead of a full GPlates system. This allows the integration of dynamic plate reconstructions into a variety of apps and analytical processes.
Corelle's architecture balances simplicity and power — its key advance is to calculate rotations on the server but leave the last step (rotating paleogeographic features to their final positions) to be run separately by each application.
The final step in plate reconstruction, applying a vector rotation to geographic features, is mathematically simple but highly dependent on the input data — leaving it for the client makes it much quicker to rotate large amounts of data dynamically through time, since map data doesn't have to repeatedly traverse the network.
Some examples from the Seton et al., 2012 rotation model:
- Global rotations at 20 Ma weigh in at 33 kB.
- Global rotations at 1 Myr intervals from 100 Ma to the present take up 2.8 MB.
Since features are rotated at the point of use, the Corelle server is only responsible for tracking the rotations themselves, allowing for much more modular and composable systems.
Corelle's plate rotation engine is built on a PostgreSQL/PostGIS database (which tracks the plate dependency tree and runs geospatial operations); quaternion rotation vectors are accumulated in Python. Modeled rotation vectors are sent to the client to be applied by separate software; simple client libraries for Python and Javascript are provided here.
Corelle's public-facing API is in beta but will eventually be integrated with Macrostrat's core services, which already power plate rotations in PBDB and other projects. Upcoming work will focus on integrating new plate models with these applications.
This repository contains several related components:
- An API server that provides rotations from several GPlates `.rot` files and associated plate polygons.
- A testing suite that validates conformance to GPlates results.
- The `@macrostrat/corelle` Javascript library, which implements quaternion rotations for displaying rotated features.
- An example web application that implements basic plate motions atop several common plate models.
A recent (>3.6) version of Python is required to run the backend code, and a recent version of Node.js is required to bundle the frontend. The Python module expects to use the `postgresql:///plate-rotations` database by default, but this can be easily changed using the `CORELLE_DB` environment variable.

To install the backend, run `make install` in this repository; the `corelle` executable should then be installed on your path. `make init` imports models and feature datasets. Then `corelle serve` starts the testing API server.

To build (and continuously watch) the frontend, run `make dev`. A backend API server will be started and proxied, so you don't have to run `corelle serve` separately.
Corelle contains an extensive set of conformance tests to ensure that its rotation handling is GPlates-compatible and that the rotation APIs perform correctly and efficiently in both Python and PostGIS. To run the test suite, run `make test-docker` in the application directory.
Install Docker and run `docker-compose up --build` in the root directory. This will build the application, install test data, and spin up the development server.

You can run a development version by creating a `.env` file containing `COMPOSE_FILE=docker-compose.yaml:docker-compose.development.yaml` in the root directory. This will tell Docker to spin up the frontend container using settings for auto-rebuilding.
- Fix subtle math bugs!
- On-database cache of rotations (say, at 1 Ma increments?)
- Return pre-rotated feature datasets (rather than just modern versions)
- Materialized view for split feature datasets
- Allow feature datasets to be listed
- Create a dockerized version
- Polish the frontend demo
Returns a list of available models.

`/api/model`
Pass points as a URL-encoded, space-separated list of comma-separated lon-lat pairs. For example, `20,-20 10,10` (rotating two points) becomes

`/api/point?model=Seton2012&data=20,-20%2010,10&time=40`
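Assembling such a query from Python can be done with the standard library; `point_url` below is a hypothetical helper for illustration, not part of the Corelle client library:

```python
from urllib.parse import quote

def point_url(model, time, points):
    """Build a /api/point query string from (lon, lat) pairs.

    Hypothetical helper, shown only to illustrate the URL format.
    """
    # Space-separated list of comma-separated lon,lat pairs
    data = " ".join(f"{lon},{lat}" for lon, lat in points)
    # Keep commas readable; percent-encode the separating spaces
    return f"/api/point?model={model}&data={quote(data, safe=',')}&time={time}"

print(point_url("Seton2012", 40, [(20, -20), (10, 10)]))
# /api/point?model=Seton2012&data=20,-20%2010,10&time=40
```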
This route gives you the axis-angle or quaternion representation of plate rotations at the specified time, for client-side rotation of points.
`/api/rotate?time=10&model=Seton2012`

```json
{
  "axis": [0.027206502237792123, 0.013804853692062557, -0.03262231893894808],
  "angle": 0.08936020386653408,
  "plate_id": 311
}
```

Pass `quaternion=true` to get the scalar-first `[w, x, y, z]` quaternion representation instead:

`/api/rotate?time=10&model=Seton2012&quaternion=true`

```json
{
  "quaternion": [
    0.9990020102870522,
    0.027206502237792123,
    0.013804853692062557,
    -0.03262231893894808
  ],
  "plate_id": 311
}
```
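Applying such a quaternion on the client is a short computation. The sketch below (plain Python, no dependencies) mirrors what the provided client libraries do, but is not their actual API; it assumes a scalar-first `(w, x, y, z)` quaternion, as in the response above:

```python
import math

def lonlat_to_cart(lon, lat):
    """Convert degrees lon/lat to a unit vector on the sphere."""
    lon, lat = math.radians(lon), math.radians(lat)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def cart_to_lonlat(x, y, z):
    """Convert a unit vector back to degrees lon/lat."""
    return (math.degrees(math.atan2(y, x)),
            math.degrees(math.asin(max(-1.0, min(1.0, z)))))

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate_point(q, lon, lat):
    """Rotate a lon/lat point by unit quaternion q, via q v q*."""
    v = (0.0, *lonlat_to_cart(lon, lat))
    w, x, y, z = q
    _, *rotated = quat_mul(quat_mul(q, v), (w, -x, -y, -z))
    return cart_to_lonlat(*rotated)

# A 90-degree rotation about the north pole moves (0, 0) to (90, 0)
half = math.pi / 4  # half the rotation angle
print(rotate_point((math.cos(half), 0.0, 0.0, math.sin(half)), 0.0, 0.0))
```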
For right now, features are not returned pre-rotated, but this capability will be added to the API. Instead, features are returned as-is, with `plate_id`, `old_lim`, and `young_lim` properties so they can be rotated client-side.
Arbitrary feature datasets (imported in advance on the backend).
The datasets are returned split on plate boundaries so that they can be rotated on the client side.
The example below fetches the `ne_110m_land` dataset:
/api/feature/ne_110m_land?model=Seton2012
TODO: allow listing of all named feature datasets.
This route returns the plate polygons features themselves.
/api/plates?model=Seton2012