## Configuration & Environment
To successfully run the Trading System, you must configure the environment variables, set up the local or cloud LLM backend, and initialize the data connectors.
### Prerequisites
Before starting the configuration, ensure you have the following installed:
- Python 3.10+
- Node.js & npm (for the dashboard)
- Ollama (optional, for local LLM execution)
- Git
### Environment Variables
The system relies on a `.env` file located in the root directory. Create this file and populate it with the following keys based on your chosen providers:
```env
# LLM Provider Keys
GROQ_API_KEY=your_groq_api_key_here

# Data Provider Keys
NEWSAPI_KEY=your_newsapi_key_here

# Application Settings
DATABASE_URL=sqlite:///./trading_system.db
LOG_LEVEL=INFO
```
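These values are typically loaded into the process environment at startup, often via a helper library such as python-dotenv. As a dependency-free illustration (this parser is a sketch, not the project's actual loader), the file format can be read like this:

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

# Apply without overriding variables already exported in the shell.
if os.path.exists(".env"):
    for key, value in load_env_file().items():
        os.environ.setdefault(key, value)
```

Using `setdefault` means variables exported in your shell take precedence over the file, which is the usual convention for `.env` loaders.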
### Configuration Details
| Variable | Description | Required |
| :--- | :--- | :--- |
| `GROQ_API_KEY` | Required if using Groq as the LLM backend (e.g., Llama 3 models). | Optional |
| `NEWSAPI_KEY` | Enables NewsAPI for sentiment analysis. If omitted, the system falls back to free RSS feeds. | Optional |
| `DATABASE_URL` | SQLAlchemy connection string for storing backtests and trade logs. | Yes |
### LLM Backend Selection
The system supports two primary backends for the trading agents. You can toggle between them when initializing the `TradingOrchestrator`.
#### 1. Ollama (Local)
Ideal for privacy and local testing. The system defaults to models such as `deepseek-r1`.

- Setup: Download Ollama and run `ollama serve`.
- Model Pull: `ollama pull deepseek-r1` (or your preferred model).
- Usage:

```python
orchestrator = TradingOrchestrator(backend="ollama")
```
#### 2. Groq (Cloud)
Ideal for high-speed inference and larger models.
- Setup: Obtain an API key from the Groq Console.
- Usage:
```python
orchestrator = TradingOrchestrator(backend="groq")
```
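Whichever backend you choose, the practical difference comes down to endpoint and authentication. The sketch below is illustrative only (`resolve_backend` and its return shape are hypothetical, not part of the `TradingOrchestrator` API): Ollama serves on its default local port 11434 with no key, while Groq needs the `GROQ_API_KEY` described above.

```python
import os

# Hypothetical helper: maps a backend name to connection settings.
def resolve_backend(name: str) -> dict:
    if name == "ollama":
        # Ollama's default local endpoint; no authentication required.
        return {"base_url": "http://localhost:11434", "api_key": None}
    if name == "groq":
        key = os.environ.get("GROQ_API_KEY")
        if not key:
            raise RuntimeError("GROQ_API_KEY must be set to use the Groq backend")
        return {"base_url": "https://api.groq.com", "api_key": key}
    raise ValueError(f"Unknown backend: {name!r}")
```

Failing fast on a missing key at initialization is preferable to a cryptic HTTP 401 midway through a trading run.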
### Knowledge Base & RAG Setup
The Retrieval-Augmented Generation (RAG) system processes documents in the `knowledge/` folder to provide agents with domain-specific context.
- Populate Knowledge: Place `.md`, `.txt`, or `.pdf` files into subfolders within `knowledge/` (e.g., `technical_analysis/`, `risk_management/`).
- Index Documents: Run the indexer to generate the vector store.

```python
from agents.rag.indexer import KnowledgeIndexer

indexer = KnowledgeIndexer()
indexer.index_all()
```
> [!NOTE]
> For PDF support, ensure `PyPDF2` is installed via pip. The system is optimized to prefer Markdown summaries for higher-quality extraction.
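Indexing begins by discovering every supported document under `knowledge/`. `KnowledgeIndexer`'s implementation is not shown here, but that discovery step can be sketched as follows (the extension list matches the supported formats above):

```python
from pathlib import Path

SUPPORTED = {".md", ".txt", ".pdf"}

def discover_documents(root: str = "knowledge") -> list[Path]:
    """Collect every supported document under the knowledge folder,
    recursing into subfolders such as technical_analysis/."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix.lower() in SUPPORTED
    )
```

Matching on `p.suffix.lower()` keeps the scan case-insensitive, so `NOTES.MD` is indexed alongside `notes.md`.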
### Dashboard Configuration
The dashboard consists of a FastAPI backend and a Vite/React frontend.
#### Backend Setup
The backend serves the API and handles WebSocket connections for real-time updates.
```bash
cd dashboard/backend
pip install -r requirements.txt
uvicorn main:app --reload --port 8000
```
#### Frontend Setup
The frontend communicates with the backend via `http://localhost:8000`.
```bash
cd dashboard/frontend
npm install
npm run dev
```
By default, the frontend runs on `http://localhost:5173`. If you change the backend port, you must update the `API_BASE` constant in `dashboard/frontend/src/App.jsx`.
### Data Connector Fallbacks
The system is designed to be resilient. The `NewsAPIConnector` implements a tiered fallback logic:
- Google News RSS / Crypto RSS: Used by default (no API key required).
- NewsAPI: Triggered if an API key is present in the environment variables and RSS sources fail or are insufficient.
No additional configuration is required to use the free tier, but performance and data granularity are significantly improved with a valid `NEWSAPI_KEY`.
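The tiered behaviour amounts to trying sources in priority order and falling through when one fails or returns too little. The function and method names below are illustrative sketches, not the connector's actual API:

```python
import os

def fetch_headlines(sources, min_results: int = 5) -> list:
    """Try each (name, fetcher) pair in order until one yields enough items.

    Each fetcher is a zero-argument callable returning a list of headlines;
    an exception or a thin result both trigger the next tier.
    """
    for name, fetcher in sources:
        try:
            items = fetcher()
        except Exception:
            continue  # source down: fall through to the next tier
        if len(items) >= min_results:
            return items
    return []

# Tier order mirrors the fallback logic above: free RSS first, then
# NewsAPI only when a key is configured in the environment.
def build_sources(rss_fetcher, newsapi_fetcher):
    sources = [("rss", rss_fetcher)]
    if os.environ.get("NEWSAPI_KEY"):
        sources.append(("newsapi", newsapi_fetcher))
    return sources
```

Keeping the tier list data-driven makes it easy to add further sources later without touching the retry loop.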