Multi-Agent Crypto Analysis System — Architecture Status Report
Purpose
This document summarizes the current state of the system, what has already been built, and what components remain to be implemented. It serves as a quick re-orientation reference for future development.
1. System Objective
Build a fully local multi-agent system capable of generating structured financial reports on crypto projects.
Core goals:
- Fully local inference
- Modular agent design
- Deterministic infrastructure layers
- Heavy reasoning isolated to a larger model
- Parallelizable agent architecture
- Extensible data sources
2. Hardware Layout
Host A — Transformation Layer
GPU: RTX 3070
Model: Qwen 3.5 9B
Responsibilities:
- Aggregation
- Data normalization
- Structural parsing
- JSON output generation
These agents perform lightweight deterministic transformations.
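A minimal sketch of this style of transformation-layer work: deterministic field mapping into a common schema. The field names below are illustrative only, not the system's actual schema.

```python
import json

def to_common_schema(source: str, raw: dict) -> str:
    """Map a raw operator record into a common schema, deterministically."""
    record = {
        "source": source,
        "id": str(raw.get("id", "")),
        "text": (raw.get("text") or "").strip(),
    }
    # sort_keys makes the serialization stable across runs
    return json.dumps(record, sort_keys=True)
```

Because the mapping has no hidden state and the serialization is key-sorted, the same input always produces the same output string.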
Host B — Reasoning Layer
GPU: RTX 3090
Model: OSS GPT 20B
Responsibilities:
- High-level reasoning
- Cross-source synthesis
- Narrative report generation
- Strategic interpretation
This is the only reasoning layer.
Orchestration Node
Platform: Proxmox VM
CPU: 8 threads
Responsibilities:
- OpenClaw orchestration
- Agent coordination
- Workflow execution
- Scheduling (future)
No heavy inference runs here.
Operators
1. url-operator (link discovery operator)
Purpose:
Retrieves a web page and returns a list of links and their respective categories in JSON format.
Capabilities:
- Fetch a single webpage
- Extract hyperlinks
- Normalize URLs
- Deduplicate links
- Categorize the links
Link categories:
- GitHub
- Twitter/X
- Documentation
- Website
- Other
Constraints:
- No crawling
- No following links
Status: Running
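The categorization step could look like the hypothetical sketch below. The host-to-category rules are assumptions, not the operator's actual rules, and "Website" detection is omitted since it would require knowing the project's own domain.

```python
from urllib.parse import urlparse

# Hypothetical mapping from hostname to link category
HOST_CATEGORIES = {
    "github.com": "GitHub",
    "twitter.com": "Twitter/X",
    "x.com": "Twitter/X",
}

def categorize(url: str) -> str:
    """Assign one of the link categories to a normalized URL."""
    parts = urlparse(url)
    host = parts.netloc.lower().removeprefix("www.")
    if host in HOST_CATEGORIES:
        return HOST_CATEGORIES[host]
    if host.startswith("docs.") or "/docs" in parts.path:
        return "Documentation"
    return "Other"
```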
2. twitter-operator
Purpose:
Access a local Twitter scraping service.
Capabilities:
- Retrieve tweets
- Retrieve account data
- Return raw JSON
Constraints:
- No tweet interpretation
- No sentiment detection
- No ranking or summarization
Status: Running
3. rss-operator
Purpose:
Access the RSS scraping service.
Capabilities:
- List feeds
- Add feeds
- Remove feeds
- Retrieve stored entries
- Trigger manual fetch
Constraints:
- No news interpretation
- No ranking
- No topic filtering
Status: Running
4. github-operator
Purpose:
Interface with the GitHub scraper service.
Capabilities:
- Extract repository metrics
- Retrieve repository statistics
Metrics returned include:
- stars
- forks
- watchers
- open issues
- language
- license
- contributors
- releases
- latest commit date
Constraints:
- No evaluation of development activity
- No repository ranking
- No popularity inference
Status: Running
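A small sketch of how a consumer could check the shape of the github-operator payload. The exact field names are assumed here, not confirmed from the service; note that this validates structure only, with no evaluation of the metrics themselves.

```python
# Assumed field names for the github-operator response
EXPECTED_FIELDS = {
    "stars", "forks", "watchers", "open_issues", "language",
    "license", "contributors", "releases", "latest_commit_date",
}

def validate_metrics(payload: dict) -> dict:
    """Reject payloads missing any expected metric; no interpretation happens here."""
    missing = EXPECTED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return payload
```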
5. web-operator
Purpose:
Analyzes the discovered links and decides whether they are relevant to the project.
Capabilities:
- Link analysis
- Normalize URLs
- Deduplicate links
- Select links relevant to the project
- Crawl the selected links and return a summary of their content
Outputs:
- JSON structure containing relevant links + content
Status: Not built
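Since this operator is not built yet, here is one possible shape of its selection step. The keyword filter is a placeholder for whatever relevance check is eventually implemented, and the summaries are simple truncations rather than model-generated text.

```python
# Placeholder relevance keywords; the real check may be model-based
PROJECT_KEYWORDS = {"protocol", "token", "whitepaper"}

def select_relevant(links: list[str], page_text: dict[str, str]) -> dict:
    """Return the relevant-links-plus-content JSON structure."""
    relevant = []
    for url in sorted(set(links)):  # dedup + deterministic ordering
        text = page_text.get(url, "").lower()
        if any(k in text for k in PROJECT_KEYWORDS):
            relevant.append({"url": url, "summary": text[:200]})
    return {"relevant": relevant}
```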
Orchestrator
Data orchestrator
Responsibilities:
- Receives a categorized list of URLs
- Spawns the operators and prompts each with the relevant information
- Awaits their responses
- Aggregates the responses of all operators and passes them to the analyst
Constraints:
- No evaluation of content
- No summarization
Outputs:
- JSON structure of the responses of the operators
Status: Not built
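The fan-out/aggregate step could be sketched as below. The operators are plain async callables standing in for the real operator services, and the function aggregates raw responses without interpreting them, as the constraints require.

```python
import asyncio

async def run_operators(operators: dict, request: dict) -> dict:
    """Fan the request out to all operators and aggregate their raw responses."""
    names = sorted(operators)  # deterministic ordering
    results = await asyncio.gather(*(operators[n](request) for n in names))
    # Key the raw responses by operator name; no evaluation or summarization
    return dict(zip(names, results))
```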
6. Analysis Layer
crypto_analyst
This is the core reasoning agent.
Responsibilities:
- Consume the data orchestrator output
- Correlate signals across sources
- Evaluate engagement-weighted signals
- Produce structured reports
Outputs:
- Narrative analysis
- Structured project reports
- Signal interpretation
Capabilities:
- Interpret meaning
- Compare sources
- Draw conclusions
- Synthesize multi-source evidence
Model used:
OSS GPT 20B.
Status: Not built
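One way the analyst's input could be assembled from the orchestrator output, since the agent is not built yet. The instruction text is a placeholder, and the call to the local OSS GPT 20B endpoint is deliberately left out.

```python
import json

def build_analyst_prompt(aggregated: dict) -> str:
    """Turn the aggregated operator output into a single prompt string."""
    payload = json.dumps(aggregated, sort_keys=True, indent=2)
    return (
        "Using only the data below, correlate signals across sources "
        "and produce a structured project report.\n\n" + payload
    )
```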
7. Current Data Flow
Expected pipeline:
user request ---------------> link_discovery_operator (url-operator)
                                         |
                                         v
 rss_operator <--------> data_orchestrator <--------> web_operator
                          |       |       |
twitter_operator <--------+       |       +--------> github_operator
                                  |
                                  v
                           crypto_analyst
                                  |
                                  v
                            final report
Operators collect raw data. The analyst interprets it.
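The pipeline above can be sketched as sequential calls. Each argument is a placeholder for the corresponding agent described in this document.

```python
def run_pipeline(project_url, discover, orchestrate, analyze):
    """Wire the three stages: discovery, fan-out collection, analysis."""
    links = discover(project_url)   # link_discovery_operator
    data = orchestrate(links)       # data_orchestrator fans out to operators
    return analyze(data)            # crypto_analyst on the reasoning host
```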
8. Determinism Rules
Operators and orchestration layers must satisfy:
- identical output for identical input
- no hidden loops
- no narrative text
- no random ordering
- no autonomous actions
This enables:
- reproducibility
- debugging
- caching
- parallel execution
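One way to satisfy these rules in practice: stable ordering plus canonical JSON serialization, so identical input always yields identical output bytes. The function name is illustrative.

```python
import json

def canonical_links(links: list[str]) -> str:
    """Deduplicate, sort, and serialize links with no incidental whitespace."""
    return json.dumps(sorted(set(links)), separators=(",", ":"))
```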
9. Current Implementation Status
Infrastructure:
twitter_operator ✓ running
rss_operator ✓ running
github_operator ✓ running
link_discovery_operator ✓ running
web_operator ☐ not built
Orchestrators:
data_orchestrator ☐ not built
Analysis layer:
crypto_analyst ☐ not built
10. Immediate Next Steps
Priority order:
1. Implement operators
   - web-operator
2. Implement orchestrators
   - data-orchestrator
3. Define analyst input strategy
4. Implement crypto-analyst
5. Run full pipeline tests
11. Long-Term Extensions
Possible future additions:
- Discord operator
- Governance forum operator
- On-chain data operator
- Sentiment analysis modules
- Market data feeds
The architecture is designed to add sources without modifying the analyst core.
Summary
The infrastructure layer is complete, with all four operators already running.
The next development phase focuses on the orchestrator layer followed by the analysis agent.
Once these components are implemented, the system will be capable of producing fully local multi-source crypto project reports.