Tokens information from all EVM blockchains, powered by Substreams
Method | Path | Query parameters (* = Required) | Description |
---|---|---|---|
GET `text/html` | `/` | - | Swagger API playground |
GET `application/json` | `/chains` | `limit`, `page` | Information about the chains and latest head block in the database |
GET `application/json` | `/{chain}/balance` | `block_num`, `contract`, `account*`, `limit`, `page` | Balances of an account |
GET `application/json` | `/{chain}/holders` | `contract*`, `limit`, `page` | List of holders of a token |
GET `application/json` | `/{chain}/supply` | `block_num`, `contract*`, `limit`, `page` | Total supply for a token |
GET `application/json` | `/{chain}/tokens` | `contract`, `symbol`, `name`, `limit`, `page` | Get info about available tokens |
GET `application/json` | `/{chain}/transfers` | `block_range`, `from`, `to`, `contract`, `limit`, `page` | All transfers related to a token |
GET `application/json` | `/{chain}/transfers/{trx_id}` | `limit`, `page` | Specific transfer related to a token |
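With the API running (on `localhost:8080` by default, per the options further down) and a chain such as `eth` loaded into the database, these endpoints can be exercised with plain `curl`. This is a sketch only; the host, port, chain name and addresses below are placeholders for your own deployment.

```bash
# List indexed chains and their latest head block
curl "http://localhost:8080/chains"

# Balances of an account (account is required; contract, block_num, limit, page are optional)
curl "http://localhost:8080/eth/balance?account=0x0000000000000000000000000000000000000000&limit=10"

# Token metadata filtered by symbol
curl "http://localhost:8080/eth/tokens?symbol=USDT&limit=5"

# Transfers for a given token contract, paginated
curl "http://localhost:8080/eth/transfers?contract=0x0000000000000000000000000000000000000000&limit=25&page=1"
```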
Method | Path | Description |
---|---|---|
GET `application/json` | `/openapi` | OpenAPI specification |
GET `application/json` | `/version` | API version and Git short commit hash |
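These two endpoints are useful for tooling; for example, the OpenAPI document can be pulled from a running instance and fed into a client generator. A sketch, again assuming a local instance on the default port:

```bash
# Save the OpenAPI specification and check which build is deployed
curl -o openapi.json "http://localhost:8080/openapi"
curl "http://localhost:8080/version"   # API version and Git short commit hash
```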
Method | Path | Description |
---|---|---|
GET `text/plain` | `/health` | Checks database connection |
GET `text/plain` | `/metrics` | Prometheus metrics |
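The plain-text endpoints are intended for liveness probes and Prometheus scraping; a sketch of how they might be checked manually against a local instance:

```bash
# Liveness / database connectivity check
curl "http://localhost:8080/health"

# Prometheus metrics in text exposition format
curl "http://localhost:8080/metrics"
```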
- ClickHouse: databases should follow a `{chain}_tokens_{version}` naming scheme. Database tables can be set up using the `schema.sql` definitions created by the `create_schema.sh` script.
- A Substreams sink for loading data into ClickHouse. We recommend Substreams Sink ClickHouse or Substreams Sink SQL. You should use the generated `protobuf` files to build your substream. This Token API makes use of the `erc20-substreams` substream.
- A Substreams API token provider to stream blockchain data.
Example of how to set up the ClickHouse backend for sinking Ethereum (ETH) data.
1. Start the ClickHouse server
clickhouse server
2. Create the token database
echo "CREATE DATABASE eth_tokens_v1" | clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password>
3. Run the `create_schema.sh` script
./create_schema.sh -o /tmp/schema.sql
4. Execute the schema
clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password> --multiquery < /tmp/schema.sql
5. Run the sink
export SUBSTREAMS_TOKEN="YOUR_SUBSTREAMS_TOKEN"
substreams-sink-sql run clickhouse://<username>:<password>@<host>:9000/eth_tokens_v1 \
https://github.com/pinax-network/erc20-substreams/releases/download/v0.0.2/erc20-substreams-v0.0.2.spkg `#Substreams package` \
-e eth.substreams.pinax.network:443 `#Substreams endpoint` \
1: `#Block range <start>:<end>` \
--undo-buffer-size 1 --on-module-hash-mistmatch=warn --batch-block-flush-interval 100 --development-mode `#Additional flags`
6. Start the API (a verification sketch follows the cluster note below)
# Will be available on localhost:8080 by default. Make sure --database excludes the chain prefix (tokens_v1, not eth_tokens_v1)
erc20-token-api --host <host> --database tokens_v1 --username <username> --password <password> --verbose
If you run ClickHouse in a cluster, change steps 2 and 3:
2. Create the token database
echo "CREATE DATABASE eth_tokens_v1 ON CLUSTER <cluster>" | clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password>
3. Run the `create_schema.sh` script
./create_schema.sh -o /tmp/schema.sql -c <cluster>
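After the sink has been running for a while, it is worth sanity-checking the pipeline before relying on the API. A sketch, reusing the database, credentials and default API port from the steps above:

```bash
# Confirm that the schema tables exist and rows are landing in ClickHouse
echo "SHOW TABLES FROM eth_tokens_v1" | clickhouse client -h <host> --port 9000 -u <user> --password <password>

# With the API started, the chains endpoint should report the latest head block
curl "http://localhost:8080/chains"
```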
Warning
Linux x86 only
$ wget https://github.com/pinax-network/erc20-token-api/releases/download/v1.0.1/erc20-token-api
$ chmod +x ./erc20-token-api
$ ./erc20-token-api --help
Usage: erc20-token-api [options]
Token balances, supply and transfers from EVM blockchains
Options:
-V, --version output the version number
-p, --port <number> HTTP port on which to attach the API (default: "8080", env: PORT)
--hostname <string> Server listen on HTTP hostname (default: "localhost", env: HOSTNAME)
--host <string> Database HTTP hostname (default: "http://localhost:8123", env: HOST)
--database <string> The database to use inside ClickHouse (default: "default", env: DATABASE)
--username <string> Database user (default: "default", env: USERNAME)
--password <string> Password associated with the specified username (default: "", env: PASSWORD)
--max-limit <number> Maximum LIMIT queries (default: 10000, env: MAX_LIMIT)
-v, --verbose <boolean> Enable verbose logging (choices: "true", "false", default: false, env: VERBOSE)
-h, --help display help for command
# API Server
PORT=8080
HOSTNAME=localhost
# Clickhouse Database
HOST=http://127.0.0.1:8123
DATABASE=default
USERNAME=default
PASSWORD=
TABLE=
MAX_LIMIT=500
# Logging
VERBOSE=true
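Every option can also be supplied through the environment (note the `env:` names in the help output above), so outside of Docker one way to use this file is to export it before launching the binary. A sketch; the variable names are exactly those listed above:

```bash
# Export all variables from .env into the current shell, then start the API
set -a
source .env
set +a
./erc20-token-api
```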
- Pull from GitHub Container Registry
For latest tagged release
docker pull ghcr.io/pinax-network/erc20-token-api:latest
For head of `main` branch
docker pull ghcr.io/pinax-network/erc20-token-api:develop
- Build from source
docker build -t erc20-token-api .
- Run with `.env` file
docker run -it --rm --env-file .env ghcr.io/pinax-network/erc20-token-api
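To reach the API from the host, you will usually also need to publish the port and make the server listen on all interfaces. A sketch assuming the default port of 8080 and the `HOSTNAME` variable documented above; adjust to your own setup:

```bash
# Hypothetical: publish port 8080 and listen on all interfaces inside the container
docker run -it --rm --env-file .env -e HOSTNAME=0.0.0.0 -p 8080:8080 ghcr.io/pinax-network/erc20-token-api
```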
See CONTRIBUTING.md.
Install Bun
$ bun install
$ bun dev
Tests
$ bun lint
$ bun test