Run a Light Node
Run a Tenzro light node that syncs block headers from the testnet, exposes JSON-RPC and Web API endpoints, and optionally serves AI models or provides TEE services to earn TNZO.
Tier 2: Light Client
Running a light node makes you a Tier 2: Light Client — a persistent, always-on participant. Unlike Tier 1 MicroNodes (MCP/TenzroClaw), you run the actual node binary, sync block headers, manage your own identity and wallet, and can take on provider roles to earn TNZO.
| Tier | Role | Runs Binary | Earns TNZO |
|---|---|---|---|
| 1 | MicroNode (MCP/TenzroClaw) | No | No |
| 2 | Light Client / Provider | Yes | Yes (if provider) |
| 3 | Validator | Yes | Yes |
Prerequisites
- Docker installed (or Linux/macOS for binary)
- 2+ CPU cores, 4 GB RAM, 20 GB disk
- Internet connection to testnet boot nodes
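You can sanity-check the hardware minimums above before pulling the image. The Python sketch below is illustrative only (not part of the Tenzro tooling) and compares local cores, RAM, and free disk against the documented 2-core / 4 GB / 20 GB floor:

```python
# Quick local check of the documented minimums (2+ cores, 4 GB RAM,
# 20 GB free disk). Thresholds mirror the prerequisites list above.
import os
import shutil

GB = 1024 ** 3

def check_prereqs(data_dir: str = ".") -> dict:
    """Return each prerequisite with its measured value and pass/fail."""
    cpus = os.cpu_count() or 0
    try:
        ram = os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")
    except (ValueError, OSError, AttributeError):
        ram = None  # not reported on this platform
    free_disk = shutil.disk_usage(data_dir).free
    return {
        "cpu": {"value": cpus, "ok": cpus >= 2},
        "ram": {"value": ram, "ok": ram is None or ram >= 4 * GB},
        "disk": {"value": free_disk, "ok": free_disk >= 20 * GB},
    }

if __name__ == "__main__":
    for name, result in check_prereqs().items():
        print(f"{name}: {'ok' if result['ok'] else 'INSUFFICIENT'} ({result['value']})")
```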
Pull the Docker Image
First, pull the latest Tenzro node image from Google Artifact Registry:
```bash
docker pull us-central1-docker.pkg.dev/tenzro-infra/tenzro/tenzro-node:latest
```

Start a Light Node
Create a data directory and start your node with Docker:
```bash
mkdir -p ~/tenzro-data

docker run -d \
  --name tenzro-node \
  -p 8545:8545 \
  -p 8080:8080 \
  -p 3001:3001 \
  -p 3002:3002 \
  -v ~/tenzro-data:/data \
  us-central1-docker.pkg.dev/tenzro-infra/tenzro/tenzro-node:latest \
  --role model-provider \
  --data-dir /data \
  --rpc-addr 0.0.0.0:8545 \
  --boot-nodes '/ip4/10.0.0.10/tcp/9000'
```

Flag explanations:
- `--role model-provider` — Run as a model provider (a light node that can serve AI models). Other roles: `tee-provider`, `user` (read-only light client)
- `--data-dir /data` — Persistent storage for chain data, keys, and model files
- `--rpc-addr 0.0.0.0:8545` — Bind the JSON-RPC server (also starts the Web API on :8080, MCP on :3001, A2A on :3002)
- `--boot-nodes` — Connect to testnet validators for block header sync
- Ports: 8545 (JSON-RPC), 8080 (Web API + Faucet), 3001 (MCP Server), 3002 (A2A Protocol)
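Every curl example in this guide sends the same JSON-RPC 2.0 envelope to port 8545. A small stdlib-only Python helper (a sketch, assuming nothing beyond a standard JSON-RPC endpoint at `localhost:8545`) can build those requests and decode hex quantity results such as `eth_blockNumber`:

```python
# Minimal JSON-RPC 2.0 client for the node's port 8545, standard
# library only. Method names mirror the curl examples in this guide.
import json
import urllib.request

RPC_URL = "http://localhost:8545"

def rpc_payload(method: str, params, req_id: int = 1) -> dict:
    """Build the JSON-RPC 2.0 envelope used by every example here."""
    return {"jsonrpc": "2.0", "method": method, "params": params, "id": req_id}

def rpc_call(method: str, params, req_id: int = 1):
    """POST a request and return the decoded `result` field."""
    body = json.dumps(rpc_payload(method, params, req_id)).encode()
    req = urllib.request.Request(
        RPC_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    if "error" in reply:
        raise RuntimeError(reply["error"])
    return reply["result"]

def hex_quantity(value: str) -> int:
    """EVM-style RPCs return quantities as 0x-prefixed hex strings."""
    return int(value, 16)

if __name__ == "__main__":
    print(hex_quantity(rpc_call("eth_blockNumber", [])))
```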
Verify Your Node
Check that your node is running and syncing:
```bash
# Check node status
curl -s http://localhost:8080/status | python3 -m json.tool

# Check block height
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":1}'

# View logs
docker logs tenzro-node --tail 20
```

Expected output shows your node's role, current block height, peer count, and ready status:
```json
{
  "role": "ModelProvider",
  "block_height": 12345,
  "peer_count": 3,
  "ready": true
}
```

Join the Network
Once your node is running, join the network to provision your identity and MPC wallet, then onboard via OAuth 2.1 + DPoP to mint a bearer JWT:
```bash
# Step 1 — provision identity + wallet (CLI)
tenzro join --name "my-node"
# Output:
# Wallet: 0x7a3b...c4f1
# DID: did:tenzro:human:03aba094-...

# Step 2 — mint a DPoP-bound bearer JWT (RFC 9449)
tenzro auth onboard-human --display-name "my-node"
# → { "access_token": "eyJ...", "dpop_bound": true, "expires_in": 3600 }

# Or via RPC if you prefer curl:
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_participate","params":{"display_name":"my-node"},"id":1}'

curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_onboardHuman","params":{"display_name":"my-node"},"id":2}'
```

The bearer JWT is DPoP-bound to your Ed25519 holder key (RFC 7638 thumbprint). Pass it on every privileged call as `Authorization: DPoP <jwt>` alongside a per-request `DPoP: <proof>` header. Read-only queries (status, balance, model list) are public. Write operations (provider registration, sending transactions, staking) require the JWT. Tokens are revocable by `jti` (`tenzro_revokeJwt`) or by DID (`tenzro_revokeDid`, cascading through the act-chain).
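The RFC 7638 thumbprint that binds the JWT to your Ed25519 holder key is computable with the standard library alone: serialize the key's required JWK members (`crv`, `kty`, `x` for an OKP key) as canonical JSON with lexicographically ordered keys and no whitespace, hash with SHA-256, and base64url-encode without padding. A sketch (the example `x` value is the public key from RFC 8037 Appendix A, not a Tenzro key):

```python
# Compute an RFC 7638 JWK thumbprint for an Ed25519 (OKP) public key.
# Only the required JWK members are hashed, serialized as JSON with
# keys in lexicographic order and no whitespace.
import base64
import hashlib
import json

def jwk_thumbprint_ed25519(x_b64url: str) -> str:
    """Return the unpadded base64url SHA-256 thumbprint of the JWK."""
    canonical = json.dumps(
        {"crv": "Ed25519", "kty": "OKP", "x": x_b64url},
        sort_keys=True,
        separators=(",", ":"),
    )
    digest = hashlib.sha256(canonical.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# Example public key from RFC 8037 Appendix A (not a Tenzro key):
example_x = "11qYAYKxCrfVS_7TyWQHOg7hcvPapiMlrwIaaPcHURo"
print(jwk_thumbprint_ed25519(example_x))
```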
Create Your Identity and Wallet
Your node auto-provisions a TDIP identity and MPC wallet on first startup. You can also create additional wallets:
```bash
# Create a new wallet
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_createWallet","params":{"key_type":"ed25519"},"id":1}'

# Request testnet TNZO from the faucet
curl -s -X POST http://localhost:8080/faucet \
  -H 'Content-Type: application/json' \
  -d '{"address":"YOUR_ADDRESS_HERE"}'

# Check your balance
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"eth_getBalance","params":["YOUR_ADDRESS_HERE","latest"],"id":1}'
```

Serve an AI Model (Model Provider)
If your node runs with `--role model-provider`, you can download and serve AI models to earn TNZO:
```bash
# Download a model from HuggingFace
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_downloadModel","params":{"model_id":"gemma3-270m"},"id":1}'

# Start serving the model
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_serveModel","params":{"model_id":"gemma3-270m"},"id":1}'

# Test inference
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_chat","params":{"model_id":"gemma3-270m","message":"What is the capital of France?","max_tokens":100},"id":1}'
```

Register as a Provider
Register with the network to appear in model discovery and earn rewards:
```bash
# Register as a provider with optional staking
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_registerProvider","params":{"stake_amount":"1000"},"id":1}'

# Check provider stats
curl -s -X POST http://localhost:8545 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","method":"tenzro_providerStats","params":{},"id":1}'
```

Node Configuration (TOML)
For advanced configuration, create a `tenzro.toml` file:
```toml
[node]
role = "model-provider"
data_dir = "/data"
log_level = "info"

[rpc]
addr = "0.0.0.0:8545"

[network]
listen_addr = "/ip4/0.0.0.0/tcp/9000"
boot_nodes = ["/ip4/10.0.0.10/tcp/9000"]

[model]
max_concurrent_inferences = 4
cache_dir = "/data/models"
```

Then start with the config file:
```bash
docker run -d \
  --name tenzro-node \
  -v ~/tenzro-data:/data \
  -v ~/tenzro.toml:/config/tenzro.toml \
  us-central1-docker.pkg.dev/tenzro-infra/tenzro/tenzro-node:latest \
  --config /config/tenzro.toml
```

Node Roles Reference
Choose the role that matches your participation level:
| Role | Description | Earns TNZO |
|---|---|---|
| `user` | Read-only light client, syncs headers | No |
| `model-provider` | Serves AI models for inference | Yes |
| `tee-provider` | Provides TEE enclave services | Yes |
Port Reference
| Port | Service | Description |
|---|---|---|
| 8545 | JSON-RPC | EVM-compatible RPC (50+ methods) |
| 8080 | Web API | REST verification, status, faucet |
| 3001 | MCP | Model Context Protocol (35 tools) |
| 3002 | A2A | Agent-to-Agent protocol |
| 9000 | P2P | libp2p gossipsub (internal) |
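You can confirm each published service is listening with a plain TCP connect. The helper below is a generic stdlib sketch, not Tenzro tooling; the ports come from the table above:

```python
# Check which of the node's documented ports accept TCP connections
# on localhost. A refused connection usually means that service is
# disabled or the container port was not published with -p.
import socket

NODE_PORTS = {
    8545: "JSON-RPC",
    8080: "Web API",
    3001: "MCP",
    3002: "A2A",
}

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port, service in NODE_PORTS.items():
        state = "open" if is_port_open("localhost", port) else "closed"
        print(f"{port:>5} {service:<8} {state}")
```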
Stopping and Restarting
Your data persists in ~/tenzro-data, so you can safely stop and restart:
```bash
# Stop gracefully
docker stop tenzro-node

# Restart (data persists in ~/tenzro-data)
docker start tenzro-node

# Remove and recreate (data persists)
docker rm tenzro-node
docker run -d --name tenzro-node ...   # (same flags as above)
```

Next Steps
- Run a Validator Node — Become a Tier 3 validator and secure the network
- Connect via MCP — Access your node from Claude Desktop
- Build a Payment Agent — Create an autonomous AI agent with TDIP identity
- Full Documentation — Deep dive into architecture and APIs