
Antelope Transactions API

Transactions information from the Antelope blockchains, powered by Substreams

Usage

| Method | Path | Query parameters (* = Required) | Description |
|--------|------|---------------------------------|-------------|
| GET `text/html` | `/` | - | Swagger API playground |
| GET `application/json` | `/actions/account/{account}` | `account`* | Actions by account |
| GET `application/json` | `/actions/date/{date}` | `date`* | Actions by date |
| GET `application/json` | `/actions/name/{name}` | `name`* | Actions by name |
| GET `application/json` | `/actions/number/{number}` | `number`* | Actions by number |
| GET `application/json` | `/authorizations/actor/{actor}` | `actor`* | Authorizations by actor |
| GET `application/json` | `/authorizations/date/{date}` | `date`* | Authorizations by date |
| GET `application/json` | `/authorizations/number/{number}` | `number`* | Authorizations by number |
| GET `application/json` | `/authorizations/permission/{permission}` | `permission`* | Authorizations by permission |
| GET `application/json` | `/authorizations/transaction/{hash}` | `hash`* | Authorizations by hash |
| GET `application/json` | `/blocks/date/{date}` | `date`* | Blocks by date |
| GET `application/json` | `/blocks/hash/{hash}` | `hash`* | Blocks by hash |
| GET `application/json` | `/blocks/number/{number}` | `number`* | Blocks by number |
| GET `application/json` | `/receivers/date/{date}` | `date`* | Receivers by date |
| GET `application/json` | `/receivers/number/{number}` | `number`* | Receivers by number |
| GET `application/json` | `/receivers/receiver/{receiver}` | `receiver`* | Receivers by receiver |
| GET `application/json` | `/receivers/transaction/{hash}` | `hash`* | Receivers by hash |
| GET `application/json` | `/search/transactions` | `account`, `action`, `auth`, `hash`, `number`, `receiver` | Search transactions, requires at least one parameter |
| GET `application/json` | `/transactions/date/{date}` | `date`* | Transactions by date |
| GET `application/json` | `/transactions/hash/{hash}` | `hash`* | Transactions by hash |

Note

All endpoints support `first`, `skip`, `order_by`, `order_direction` as additional query parameters.
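For example, a minimal sketch of paginating one of the JSON endpoints with curl; the host and port assume the API's defaults from the options below, and the account name is only a placeholder:

```bash
# Second page of 20 actions for an example account (eosio.token is a placeholder).
# order_by / order_direction can be appended the same way, using a column from the response.
curl "http://localhost:8080/actions/account/eosio.token?first=20&skip=20"
```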

Docs

| Method | Path | Description |
|--------|------|-------------|
| GET `application/json` | `/openapi` | OpenAPI specification |
| GET `application/json` | `/version` | API version and Git short commit hash |

Monitoring

| Method | Path | Description |
|--------|------|-------------|
| GET `text/plain` | `/health` | Checks database connection |
| GET `text/plain` | `/metrics` | Prometheus metrics |
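Once the API is running (default `localhost:8080`, see the options below), both endpoints can be probed with curl:

```bash
# Plain-text health check of the database connection
curl http://localhost:8080/health

# Prometheus metrics, e.g. to verify a scrape target before wiring it into Prometheus
curl http://localhost:8080/metrics
```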

GraphQL

Go to /graphql for a GraphiQL interface.

X-Api-Key

Use the Variables tab at the bottom to add your API key:

{
  "X-Api-Key": "changeme"
}
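The same header can also be sent outside of GraphiQL. The sketch below only issues a trivial introspection query; use the GraphiQL interface to discover the actual queries and fields. Host, port and key are placeholders:

```bash
# POST a minimal GraphQL query with the API key header
curl -X POST "http://localhost:8080/graphql" \
  -H "Content-Type: application/json" \
  -H "X-Api-Key: changeme" \
  -d '{"query": "{ __typename }"}'
```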

Additional notes

  • Don't forget to request the meta fields in the response to get access to pagination and statistics!

Requirements

API stack architecture

API architecture diagram

Setting up the database backend (ClickHouse)

Without a cluster

Example of how to set up the ClickHouse backend for sinking EOS data.

1. Start the ClickHouse server
clickhouse server
2. Create the transactions database
echo "CREATE DATABASE eos_transactions_v1" | clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password>
3. Run the create_schema.sh script
./create_schema.sh -o /tmp/schema.sql
4. Execute the schema
cat /tmp/schema.sql | clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password>
5. Run the sink
substreams-sink-sql run clickhouse://<username>:<password>@<host>:9000/eos_transactions_v1 \
https://github.com/pinax-network/antelope-transactions-substreams/releases/download/v0.1.1/antelope-transactions-v0.1.1.spkg `#Substreams package` \
-e eos.substreams.pinax.network:443 `#Substreams endpoint` \
1: `#Block range <start>:<end>` \
--final-blocks-only --undo-buffer-size 1 --on-module-hash-mismatch=warn --batch-block-flush-interval 100 --development-mode `#Additional flags`
6. Start the API
# Will be available on localhost:8080 by default
antelope-transactions-api --host <host> --database eos_transactions_v1 --username <username> --password <password> --verbose
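A couple of sanity checks once the sink is writing and the API is up. The table names depend on the generated schema, so SHOW TABLES is used rather than guessing them; the API host and port assume the defaults:

```bash
# Confirm the schema landed in the transactions database
echo "SHOW TABLES FROM eos_transactions_v1" | clickhouse client -h <host> --port 9000 -u <user> --password <password>

# Confirm the API can reach ClickHouse
curl http://localhost:8080/health
```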

With a cluster

If you run ClickHouse in a cluster, change steps 2 and 3:

2. Create the transactions database
echo "CREATE DATABASE eos_transactions_v1 ON CLUSTER <cluster>" | clickhouse client -h <host> --port 9000 -d <database> -u <user> --password <password>
3. Run the create_schema.sh script
./create_schema.sh -o /tmp/schema.sql -c <cluster>

Then follow the same steps as without a cluster.

Warning

Linux x86 only

$ wget https://github.com/pinax-network/antelope-transactions-api/releases/download/v0.2.0/antelope-transactions-api
$ chmod +x ./antelope-transactions-api
$ ./antelope-transactions-api --help                                                                          
Usage: antelope-transactions-api [options]

Transactions information from the Antelope blockchains

Options:
  -V, --version            output the version number
  -p, --port <number>      HTTP port on which to attach the API (default: "8080", env: PORT)
  --hostname <string>      Server listen on HTTP hostname (default: "localhost", env: HOSTNAME)
  --host <string>          Database HTTP hostname (default: "http://localhost:8123", env: HOST)
  --database <string>      The database to use inside ClickHouse (default: "default", env: DATABASE)
  --username <string>      Database user (default: "default", env: USERNAME)
  --password <string>      Password associated with the specified username (default: "", env: PASSWORD)
  --max-limit <number>     Maximum LIMIT queries (default: 10000, env: MAX_LIMIT)
  -v, --verbose <boolean>  Enable verbose logging (choices: "true", "false", default: false, env: VERBOSE)
  -h, --help               display help for command

.env Environment variables

# API Server
PORT=8080
HOSTNAME=localhost

# Clickhouse Database
HOST=http://127.0.0.1:8123
DATABASE=default
USERNAME=default
PASSWORD=
MAX_LIMIT=500

# Logging
VERBOSE=true

Docker environment

  • Pull from GitHub Container registry

For latest tagged release

docker pull ghcr.io/pinax-network/antelope-transactions-api:latest

For head of main branch

docker pull ghcr.io/pinax-network/antelope-transactions-api:develop
  • Build from source
docker build -t antelope-transactions-api .
  • Run with .env file
docker run -it --rm --env-file .env ghcr.io/pinax-network/antelope-transactions-api

Contributing

See CONTRIBUTING.md.

Quick start

Install Bun

$ bun install
$ bun dev

Tests

$ bun lint
$ bun test