## Prerequisites

- Active Claude subscription (Pro, Max, or API access)
- Gigabrain API key from your Profile
- Node.js or Python installed
## Setup

### Install Claude Code

Install Claude Code globally via npm:

```shell
npm install -g @anthropic-ai/claude-code
```
### Create your project

Set up a new trading bot project:

```shell
mkdir sentiment-analyzer
cd sentiment-analyzer
```

Create a `.env` file with your API key:

```
GIGABRAIN_API_KEY=gb_sk_your_key_here
```
### Configure Claude Code

Create `CLAUDE.md` in your project root to teach Claude about the Gigabrain API:

````markdown
# Gigabrain Trading Bot Development

## API Details
- Base URL: `https://api.gigabrain.gg`
- Endpoint: `/v1/chat` (POST)
- Auth header: `Authorization: Bearer gb_sk_...`
- Response field: `content` (not `message`)
- Timeout: minimum 600 seconds

## Rate Limits
- 60 requests/minute
- Handle 429 errors with exponential backoff
- Monitor the `X-RateLimit-Remaining-Minute` header

## Query Patterns
For structured data, always add "Respond as JSON" and specify exact fields:

```
Get fear and greed index. Respond as JSON with:
fear_greed_index, fear_greed_label, btc_dominance
```

## The Brain - Specialists
- **Macro**: DXY, VIX, yields, Fed Funds, S&P 500, risk regime
- **Microstructure**: Funding rates, OI, liquidations, long/short ratios
- **Fundamentals**: TVL, protocol revenue, active users, token metrics
- **Market State**: Fear & Greed, narratives, sentiment, regime shifts
- **Price Movement**: Technical analysis, EMAs, RSI, MACD, trade setups
- **Trenches**: Micro-cap tokens, social momentum, KOL mentions
- **Polymarket**: Prediction markets, odds, volume, resolution dates

## Error Handling
- 401: invalid API key
- 429: rate limit exceeded (retry after `Retry-After` seconds)
- 500/503/504: retry with exponential backoff
- Always log `session_id` for debugging

## Best Practices
- Use environment variables for API keys
- Cache responses when appropriate
- Implement retry logic for transient errors
- Break complex queries into smaller requests
- Specify exact JSON fields for consistency
````
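The retry guidance above (handle 429s with exponential backoff) can be sketched as a small helper. The `base` and `cap` defaults below are illustrative assumptions, not values mandated by the API:

```python
import random

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Exponential backoff with jitter.

    The delay doubles each attempt (1s, 2s, 4s, ...) up to `cap` seconds,
    with random jitter so many clients don't retry in lockstep.
    """
    delay = min(cap, base * (2 ** attempt))
    return delay * random.uniform(0.5, 1.0)
```

Call `time.sleep(backoff_delay(attempt))` between retries; the jitter matters most when several bots share the same 60 requests/minute budget.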
## Example: Market Sentiment Analyzer

Build a tool that aggregates multiple sentiment indicators into a single market regime score:
```python
import os
import time
import json
from datetime import datetime

import requests
from dotenv import load_dotenv

load_dotenv()
API_KEY = os.getenv("GIGABRAIN_API_KEY")
BASE_URL = "https://api.gigabrain.gg"


class SentimentAnalyzer:
    def __init__(self):
        self.session = requests.Session()
        self.session.headers.update({
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json"
        })

    def query_gigabrain(self, message):
        """Query the Gigabrain API with error handling."""
        try:
            response = self.session.post(
                f"{BASE_URL}/v1/chat",
                json={"message": message},
                timeout=600
            )
            if response.status_code == 200:
                data = response.json()
                return json.loads(data["content"])
            elif response.status_code == 429:
                retry_after = int(response.headers.get("Retry-After", 60))
                print(f"⏳ Rate limited. Waiting {retry_after}s...")
                time.sleep(retry_after)
                return self.query_gigabrain(message)
            else:
                raise Exception(f"API Error {response.status_code}: {response.text}")
        except Exception as e:
            print(f"❌ Error: {e}")
            return None

    def get_fear_greed(self):
        """Fetch the Fear & Greed Index."""
        query = """
        Get BTC fear and greed index. Respond as JSON with:
        fear_greed_index, fear_greed_label, btc_dominance,
        altcoin_season_index, market_cap_total, market_cap_change_24h
        """
        return self.query_gigabrain(query)

    def get_funding_sentiment(self):
        """Analyze funding rates for sentiment."""
        query = """
        Get funding rates for BTC, ETH, SOL. Respond as JSON array with:
        symbol, funding_rate, open_interest, long_short_ratio
        """
        return self.query_gigabrain(query)

    def get_narratives(self):
        """Fetch trending narratives."""
        query = """
        Get current crypto narratives ranked by momentum. Respond as JSON array with:
        narrative, momentum_score, top_tokens, sentiment
        """
        return self.query_gigabrain(query)

    def calculate_regime_score(self, fear_greed, funding, narratives):
        """Calculate an overall market regime score (0-100)."""
        scores = []
        # Fear & Greed contribution (0-40 points)
        if fear_greed:
            fg_index = fear_greed.get("fear_greed_index", 50)
            scores.append(fg_index * 0.4)
        # Funding rate contribution (0-30 points):
        # high absolute funding = extreme positioning = lower score
        if funding:
            avg_funding = sum(abs(t["funding_rate"]) for t in funding) / len(funding)
            funding_score = max(0, 30 - (avg_funding * 1000))
            scores.append(funding_score)
        # Narrative momentum contribution (0-30 points)
        if narratives:
            avg_momentum = sum(n["momentum_score"] for n in narratives[:3]) / 3
            scores.append(avg_momentum * 0.3)
        return sum(scores) if scores else 50

    def get_regime_label(self, score):
        """Convert a score to a regime label."""
        if score >= 70:
            return "🟢 RISK ON"
        elif score >= 50:
            return "🟡 NEUTRAL"
        elif score >= 30:
            return "🟠 CAUTIOUS"
        else:
            return "🔴 RISK OFF"

    def analyze(self):
        """Run the complete sentiment analysis."""
        print("🧠 Gigabrain Market Sentiment Analyzer")
        print("=" * 60)
        print(f"Timestamp: {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}\n")

        # Fetch all data
        print("📊 Fetching market data...")
        fear_greed = self.get_fear_greed()
        funding = self.get_funding_sentiment()
        narratives = self.get_narratives()

        # Display results
        if fear_greed:
            print(f"\n📈 Fear & Greed Index: {fear_greed['fear_greed_index']} ({fear_greed['fear_greed_label']})")
            print(f"   BTC Dominance: {fear_greed['btc_dominance']}%")
            print(f"   Altcoin Season Index: {fear_greed['altcoin_season_index']}")
        if funding:
            print("\n💰 Funding Rates:")
            for token in funding:
                rate_pct = token['funding_rate'] * 100
                direction = "LONG" if rate_pct > 0 else "SHORT"
                print(f"   {token['symbol']}: {rate_pct:.4f}% ({direction} paying)")
        if narratives:
            print("\n🔥 Top Narratives:")
            for i, narrative in enumerate(narratives[:3], 1):
                print(f"   {i}. {narrative['narrative']} (Momentum: {narrative['momentum_score']}/100)")
                print(f"      Tokens: {', '.join(narrative['top_tokens'])}")

        # Calculate regime
        regime_score = self.calculate_regime_score(fear_greed, funding, narratives)
        regime_label = self.get_regime_label(regime_score)
        print(f"\n{'=' * 60}")
        print(f"🎯 MARKET REGIME: {regime_label}")
        print(f"   Score: {regime_score:.1f}/100")
        print(f"{'=' * 60}")

        return {
            "timestamp": datetime.now().isoformat(),
            "regime_score": regime_score,
            "regime_label": regime_label,
            "fear_greed": fear_greed,
            "funding": funding,
            "narratives": narratives
        }


if __name__ == "__main__":
    analyzer = SentimentAnalyzer()
    result = analyzer.analyze()
    # Save to file
    with open("sentiment_report.json", "w") as f:
        json.dump(result, f, indent=2)
    print("\n💾 Report saved to sentiment_report.json")
```
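To sanity-check the weighting in `calculate_regime_score` without hitting the API, you can run the same logic on synthetic inputs. All numbers below are made up for illustration:

```python
# Standalone copy of the scoring logic from SentimentAnalyzer above,
# so the weighting can be verified offline.
def regime_score(fear_greed, funding, narratives):
    scores = []
    if fear_greed:
        scores.append(fear_greed.get("fear_greed_index", 50) * 0.4)
    if funding:
        avg_funding = sum(abs(t["funding_rate"]) for t in funding) / len(funding)
        scores.append(max(0, 30 - avg_funding * 1000))
    if narratives:
        scores.append(sum(n["momentum_score"] for n in narratives[:3]) / 3 * 0.3)
    return sum(scores) if scores else 50

# Synthetic inputs (made-up values)
fg = {"fear_greed_index": 75}                                # greed -> 30 points
fd = [{"funding_rate": 0.0001}, {"funding_rate": -0.0001}]   # mild funding -> 29.9 points
nr = [{"momentum_score": 80}, {"momentum_score": 60}, {"momentum_score": 40}]  # -> 18 points

print(regime_score(fg, fd, nr))  # roughly 77.9 (30 + 29.9 + 18)
```

Note the funding term rewards *low* absolute funding: heavily skewed positioning pulls the regime score down even when Fear & Greed is high.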
## Using Claude Code Effectively

### Example Prompts

Ask Claude Code to help you build trading tools.

Build a liquidation tracker:

```
Create a script that monitors liquidations from Gigabrain API and sends
alerts when total liquidations exceed $500M in 24h. Use the API patterns
from CLAUDE.md. Include error handling and rate limiting.
```
Build a narratives dashboard:

```
Build a web dashboard that displays trending crypto narratives from
Gigabrain with momentum scores. Auto-refresh every 5 minutes.
Use React and the Gigabrain API.
```
Build a multi-timeframe analyzer:

```
Create a tool that analyzes BTC across 1H, 4H, and 1D timeframes using
Gigabrain's Price Movement specialist. Compare technical indicators and
generate a consensus signal.
```
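For the consensus signal in the last prompt, one toy aggregation rule is a majority vote across per-timeframe signals. This rule is purely an illustration, not a method defined by Gigabrain:

```python
from collections import Counter

def consensus(signals):
    """Toy majority-vote consensus across per-timeframe signals
    (e.g. 'bull' / 'bear' / 'neutral' for 1H, 4H, 1D).

    Returns the winning signal if it holds a strict majority,
    otherwise 'mixed'.
    """
    top, count = Counter(signals).most_common(1)[0]
    return top if count > len(signals) / 2 else "mixed"

print(consensus(["bull", "bull", "neutral"]))  # → bull
```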
### Best Practices

#### 1. Structured Queries

Always specify exact JSON fields for consistency:

```python
# Good
query = "Get funding rates for top 10 perps. Respond as JSON array with: symbol, funding_rate, open_interest"

# Bad
query = "What are the funding rates?"
```
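Requesting exact fields also makes responses easy to validate before use. A minimal check against a hypothetical `content` string might look like:

```python
import json

# Hypothetical raw `content` string, shaped like the "Good" query requests
raw = '[{"symbol": "BTC", "funding_rate": 0.0001, "open_interest": 1200000000}]'
rows = json.loads(raw)

# Fail fast if the model omitted a requested field
required = {"symbol", "funding_rate", "open_interest"}
for row in rows:
    missing = required - row.keys()
    assert not missing, f"response missing fields: {missing}"
```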
#### 2. Error Recovery

Implement robust error handling:

```python
import time

import requests

def safe_api_call(message, retries=3):
    for attempt in range(retries):
        try:
            response = requests.post(...)  # same request shape as query_gigabrain
            if response.status_code == 200:
                return response.json()
            elif response.status_code in (500, 503, 504):
                # Transient server error: back off exponentially, then retry
                time.sleep(2 ** attempt)
                continue
        except requests.exceptions.Timeout:
            if attempt < retries - 1:
                continue
            raise
    return None
```
#### 3. Response Caching

Cache responses to reduce API calls:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=128)
def get_cached_data(query, ttl_hash):
    # ttl_hash changes every `ttl` seconds, invalidating stale entries
    return query_gigabrain(query)

# Use with a TTL
ttl = 300  # 5 minutes
ttl_hash = int(time.time() / ttl)
data = get_cached_data(query, ttl_hash)
```
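The `ttl_hash` trick can be wrapped in a small decorator so call sites stay clean. This is one possible sketch; `fake_query` is a hypothetical stand-in for a real Gigabrain call:

```python
import time
from functools import lru_cache, wraps

def ttl_cache(ttl_seconds, maxsize=128):
    """Decorator version of the ttl_hash pattern: entries are reused
    until the current time bucket of `ttl_seconds` rolls over."""
    def decorator(fn):
        @lru_cache(maxsize=maxsize)
        def cached(ttl_hash, *args):
            return fn(*args)

        @wraps(fn)
        def wrapper(*args):
            # Bucket the clock so the cache key changes every ttl_seconds
            return cached(int(time.time() / ttl_seconds), *args)
        return wrapper
    return decorator

calls = []

@ttl_cache(ttl_seconds=300)
def fake_query(message):
    calls.append(message)        # stands in for a real API request
    return f"answer to {message!r}"

fake_query("fear and greed")
fake_query("fear and greed")     # served from cache; no second "request"
print(len(calls))  # → 1
```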
## Next Steps

- **Brain API Overview**: complete reference for all agents and data types
- **REST API Docs**: authentication, endpoints, and error handling
- **Cursor Integration**: build trading bots with Cursor AI
- **Windsurf Integration**: use Windsurf Cascade with the Gigabrain API
Gigabrain provides market intelligence tools, not financial advice. Implement proper risk management in all trading applications. See the Risk Disclosure.