Persistence & Database
The system uses a multi-tiered persistence strategy to handle structured trading data, unstructured domain knowledge, and session-based backtest reports. This architecture ensures that agent reasoning is preserved for auditability, performance metrics are available for the dashboard, and the RAG (Retrieval-Augmented Generation) system has access to indexed financial literature.
Database Architecture
The application relies on SQLAlchemy for relational data management and Vector Stores (via LangChain) for document retrieval.
Relational Models (SQLAlchemy)
The core database tracks the lifecycle of trading sessions and individual executions. These models are primarily used by the FastAPI backend to serve the trading dashboard.
- Backtest: Stores the high-level results of a simulation, including the initial capital, final portfolio value, and performance metrics (Sharpe Ratio, Max Drawdown). It also stores the `equity_curve` as a JSON blob for time-series visualization.
- Trade / BacktestTrade: Captures granular details for every buy/sell action.
- Metadata tracked: Timestamp, asset, action (BUY/SELL), price, and quantity.
- LLM Insights: Crucially, these models store the `reasoning` and `confidence` scores generated by the agents, allowing users to "look back" at why a specific trade was made.
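As a sketch of what the Backtest model described above might look like in SQLAlchemy (the column names here are illustrative assumptions, not the project's actual schema):

```python
# Hypothetical sketch of the Backtest model; column names are
# assumptions based on the fields described above.
from sqlalchemy import JSON, Column, Float, Integer, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Backtest(Base):
    __tablename__ = "backtests"

    id = Column(Integer, primary_key=True)
    initial_capital = Column(Float, nullable=False)
    final_value = Column(Float)
    sharpe_ratio = Column(Float)
    max_drawdown = Column(Float)
    equity_curve = Column(JSON)  # time series stored as a JSON blob

# Local development defaults to SQLite, so an in-memory DB works for demos.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Backtest(
        initial_capital=10_000.0,
        final_value=11_500.0,
        equity_curve=[10_000.0, 10_400.0, 11_500.0],
    ))
    session.commit()
```

Storing the equity curve as JSON keeps the time series queryable per-backtest without a separate table per tick.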
Vector Store (RAG Persistence)
For the Sentiment, Technical, and Fundamental agents to make informed decisions, the system persists document embeddings in a vector database.
- Domain Segregation: Documents are indexed into specific namespaces (e.g., `technical`, `sentiment`, `fundamental`, `risk`).
- Indexing Workflow: The `KnowledgeIndexer` processes Markdown and text files from the `knowledge/` directory, chunks them, and updates the vector store.
```python
# Example: Indexing domain-specific knowledge
from agents.rag.indexer import KnowledgeIndexer

indexer = KnowledgeIndexer()
stats = indexer.index_all()
print(f"Indexed {stats['technical']} chunks for the Technical Agent.")
```
Data Schemas
The system uses Pydantic models to bridge the gap between LLM outputs and database records. This ensures that every agent's recommendation follows a strict schema before being persisted.
Agent Response Schema
All trading agents must return data conforming to the `AgentRecommendation` model. This structure is what eventually populates the `reasoning` column in the database.
| Field | Type | Description |
| :--- | :--- | :--- |
| recommendation | Enum | BUY, SELL, or HOLD |
| confidence | Integer | 0-100 score of the agent's certainty |
| reasoning | String | The detailed logic behind the decision |
| key_factors | List[str] | Top 5 data points driving the signal |
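A minimal Pydantic sketch of the schema in the table above (the validation constraints and enum name are assumptions; the source only specifies the field names and types):

```python
# Hypothetical sketch of the AgentRecommendation schema from the table
# above; the Action enum name and range constraints are assumptions.
from enum import Enum
from typing import List

from pydantic import BaseModel, Field

class Action(str, Enum):
    BUY = "BUY"
    SELL = "SELL"
    HOLD = "HOLD"

class AgentRecommendation(BaseModel):
    recommendation: Action
    confidence: int = Field(ge=0, le=100)  # 0-100 certainty score
    reasoning: str                          # detailed decision logic
    key_factors: List[str]                  # top 5 data points driving the signal

rec = AgentRecommendation(
    recommendation="BUY",
    confidence=72,
    reasoning="RSI oversold alongside a positive earnings surprise.",
    key_factors=["RSI < 30", "EPS beat"],
)
```

Because Pydantic validates on construction, a malformed LLM output (e.g., `confidence=150`) raises an error before anything reaches the database.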
File-Based Persistence
In addition to the database, the system generates standalone session reports in the logs/ directory.
Backtest Reports
Each backtest run generates a backtest_report_[TIMESTAMP].json file. This acts as a portable snapshot of the simulation, containing the full trade log and equity curve.
You can use the `view_trades.py` script to inspect these files directly from the CLI:
```shell
# View the most recent backtest trade log
python view_trades.py
```
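If you want to process the reports programmatically instead, the timestamped naming scheme makes it easy to locate the latest one. A minimal sketch (the report's internal layout, such as a `trades` key, is an assumption):

```python
# Hypothetical sketch: find and load the most recent backtest report.
# The "trades" key is an assumption about the report's JSON layout.
import json
from pathlib import Path

def latest_report(log_dir: str = "logs"):
    """Return the most recent backtest_report_*.json as a dict, or None."""
    reports = sorted(Path(log_dir).glob("backtest_report_*.json"))
    if not reports:
        return None
    with open(reports[-1]) as f:
        return json.load(f)

report = latest_report()
if report is not None:
    print(f"Latest report contains {len(report.get('trades', []))} trades.")
```

Sorting the filenames lexicographically works here because the timestamp suffix sorts chronologically.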
API Access & Integration
The Dashboard API provides a structured interface to interact with the persisted data. Developers can query these endpoints to retrieve historical performance.
Endpoint Summary
- `GET /api/backtests`: Returns a list of all historical backtest summaries.
- `GET /api/backtests/{id}`: Returns a detailed report, including the full `equity_curve` and associated `trades`.
- `WS /ws`: A WebSocket connection that streams real-time trade persistence events to the frontend.
Example: Fetching Trade History
```javascript
// Example frontend fetch for a specific backtest
const response = await axios.get('http://localhost:8000/api/backtests/1');
const { trades, equity_curve } = response.data;
console.log(`Backtest #1 had ${trades.length} trades.`);
```
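The same endpoints can be queried from Python scripts as well. A standard-library sketch (the base URL and response shape follow the endpoint summary above; anything beyond that is an assumption):

```python
# Hypothetical sketch: query the dashboard API's backtest list endpoint
# using only the standard library. The response is assumed to be a JSON
# array of backtest summaries, per the endpoint summary above.
import json
import urllib.request

def list_backtests(base_url: str = "http://localhost:8000"):
    """Fetch all historical backtest summaries from the dashboard API."""
    with urllib.request.urlopen(f"{base_url}/api/backtests") as resp:
        return json.load(resp)
```

This keeps analysis scripts free of frontend dependencies while reading the same persisted data the dashboard uses.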
Configuration
Database connection strings and API keys for external data connectors (like NewsAPI) are managed via environment variables.
- `DATABASE_URL`: Connection string for SQLAlchemy (defaults to SQLite for local development).
- `NEWSAPI_KEY`: Required for fetching supplemental market news for the Sentiment Agent.
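Reading these variables at startup might look like the following sketch (the SQLite fallback filename is an assumption; the source only says SQLite is the local default):

```python
# Minimal sketch of loading the persistence configuration from the
# environment. The fallback SQLite path is an assumption.
import os

DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./trading.db")
NEWSAPI_KEY = os.getenv("NEWSAPI_KEY")  # only needed by the Sentiment Agent

if NEWSAPI_KEY is None:
    print("NEWSAPI_KEY not set; Sentiment Agent news fetching will be disabled.")
```

Falling back to SQLite keeps local development zero-configuration while letting deployments point `DATABASE_URL` at a managed database.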