Add Your Model
Start Earning
Contribute your GPU and AI model to the OpenHLM network. Earn 90% of the fee for every job you serve. Works behind NAT. No port forwarding needed.
Revenue Estimator
| Monthly Jobs | Avg GAS per Token | Avg Tokens per Job | Monthly Revenue |
|---|---|---|---|
| 1,000 | 10 | 500 | 4,500,000 HLM |
| 10,000 | 10 | 500 | 45,000,000 HLM |
| 100,000 | 10 | 500 | 450,000,000 HLM |
Monthly Revenue = Monthly Jobs × Avg GAS per Token × Avg Tokens per Job × 0.90 (the node operator keeps 90% of every fee)
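The table above follows directly from this formula. A quick sketch of the arithmetic in plain Python (illustrative only, not part of the agent):

```python
# Revenue estimator: Revenue = Jobs x GAS per token x Tokens per job x 0.90
OPERATOR_SHARE = 0.90  # 90% of every job fee goes to the node operator

def monthly_revenue(jobs: int, gas_per_token: int, tokens_per_job: int) -> int:
    """Estimated monthly revenue in HLM for a node operator."""
    return int(jobs * gas_per_token * tokens_per_job * OPERATOR_SHARE)

for jobs in (1_000, 10_000, 100_000):
    print(f"{jobs:>7,} jobs/month -> {monthly_revenue(jobs, 10, 500):,} HLM")
# 1,000 -> 4,500,000 | 10,000 -> 45,000,000 | 100,000 -> 450,000,000
```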
Get Started in 3 Steps
Install the Agent
curl -fsSL https://openhlm.com/install.sh | bash
Onboard Your Model
openhlm-agent onboard \
  --pool llama-70b \
  --endpoint https://openhlm.com \
  --capacity 2 \
  --default-gas 10 \
  --badges solar-powered,carbon-free
This generates your wallet (12-word mnemonic — save it!), registers your node, and declares which model pools you serve.
Popular Model Pools
You can create new pools too! Just use any model name.
Start Serving & Earning
openhlm-agent start
Your node connects to the OpenHLM network via gRPC and starts receiving inference jobs. You earn 90% of every fee automatically.
Works Behind NAT
Outbound gRPC connection. No port forwarding, no dynamic DNS needed.
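Why NAT is a non-issue: the node always dials out, so your router only ever sees an ordinary outbound connection, and jobs arrive back over that same connection. A deliberately simplified sketch of the dial-out-and-reconnect pattern (plain sockets instead of gRPC, hypothetical endpoint, not the agent's actual code):

```python
import socket
import time

ENDPOINT = ("coordinator.example", 443)  # hypothetical host/port, for illustration only

def serve_forever():
    """Keep one outbound connection open; jobs arrive on it, results go back on it."""
    backoff = 1
    while True:
        try:
            with socket.create_connection(ENDPOINT, timeout=10) as conn:
                backoff = 1  # connected: reset the retry delay
                while True:
                    job = conn.recv(4096)  # inference jobs pushed over the outbound socket
                    if not job:
                        break              # coordinator closed the connection
                    # ... run inference locally and send the result back on `conn` ...
        except OSError:
            pass                           # network hiccup: fall through and reconnect
        time.sleep(backoff)
        backoff = min(backoff * 2, 60)     # simple exponential backoff, capped at 60s
```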
Ed25519 Identity
Your node has a unique cryptographic identity. All messages are signed.
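In practice, "all messages are signed" means each message the node sends carries an Ed25519 signature that anyone holding the node's public key can check. A minimal illustration using Python's `cryptography` package (library choice and message shape are assumptions, not OpenHLM internals):

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Generate a node identity (in OpenHLM this is created for you during onboarding)
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Sign an outgoing message and verify it on the receiving side
message = b'{"node": "my-node", "pool": "llama-70b", "tokens": 500}'
signature = private_key.sign(message)
public_key.verify(signature, message)  # raises InvalidSignature if either was tampered with
```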
Live Dashboard
Monitor your node, earnings, and reputation from the web dashboard.
Eco Badges
Declare your green credentials. Users can choose eco-friendly nodes.
Multi-Model
Serve multiple model pools from a single node. Supports Ollama, llama.cpp, and vLLM.
Reputation System
Higher reputation = more jobs = more earnings. Built on performance metrics.
Your Direct Chat Link
Get a unique shareable URL (openhlm.com/m/your-id). Share it anywhere -- 100% of GAS goes directly to you. Build your own customer base.
Fair Random Selection
Normal chat uses weighted random selection -- not "best node wins". Even new nodes start earning from day one.
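A sketch of what weighted random selection can look like when the weights come from reputation (the actual OpenHLM weighting is not specified here; this just shows why new or lower-reputation nodes still receive jobs, unlike a winner-takes-all rule):

```python
import random

# Hypothetical pool of nodes with reputation scores used as selection weights
nodes = [("node-a", 0.95), ("node-b", 0.80), ("node-c", 0.50)]

def pick_node(pool):
    """Weighted random pick: higher reputation means more jobs, but no node is shut out."""
    ids, weights = zip(*pool)
    return random.choices(ids, weights=weights, k=1)[0]

# Rough job share each node would see under these weights
counts = {node_id: 0 for node_id, _ in nodes}
for _ in range(10_000):
    counts[pick_node(nodes)] += 1
print(counts)  # roughly 42% / 36% / 22% -- even the lowest-reputation node keeps earning
```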
Supported Model Runners
Ollama, llama.cpp, and vLLM.
Ready to contribute?
Join the network and start earning HLM tokens today.