20260310_1642
This commit is contained in:
parent 2a521e8fd6
commit 225255d284

@@ -0,0 +1,48 @@
# Centrifuge (CFG) — Analysis Report

*Generated: 2026-03-10 19:00:00 UTC*

## Executive Summary

Centrifuge (CFG) is a real‑world‑asset tokenisation platform built on Polkadot that has been active for about seven years. The project’s public presence is robust: an official site with substantial documentation, regular Twitter activity, and several third‑party announcements of partnerships (e.g., with Resolv, Aave, and Janus Henderson). GitHub activity is modest but focused: a single core maintainer and a handful of collaborators, with frequent commits in the past year. No public audits or open‑source releases are listed, and the open‑source surface of the protocol is limited to a few Go libraries and a token‑specific smart‑contract library. While community and media coverage are healthy, the lack of audited code and the small, single‑maintainer development footprint pose a moderate risk for developers looking to build on the platform. For retail investors, the project appears to be pursuing legitimate real‑world‑asset use cases with institutional backing, but the limited transparency of the code base and the absence of comprehensive security reviews warrant a moderate level of due‑diligence caution.

## Project Overview

Centrifuge is a decentralized asset financing protocol that connects DeFi with real‑world assets (RWA). Its goal is to lower the cost of capital for SMEs and to give investors a stable source of income through tokenised assets. The project launched in 2019, and its token (CFG) migrated to an EVM contract in 2024. The project runs its own blockchain built on Polkadot, while tokens and liquidity are provided via Tinlake on Ethereum.

Whitepaper: none found on the official site or via public searches. Key documentation is available on the main site under “Docs”, covering the Tinlake protocol and smart‑contract references. The site also hosts a blog with partnership announcements and technical guides.

## Development Activity

- The primary GitHub repo `centrifuge/centrifuge` has 0 stars, 0 forks, 1 contributor, and 9 recent commits, the most recent from April 2025. There are no public releases; the repository is maintained but small in scope.

- Supporting repo `centrifuge/chain-custom-types` has 10 releases, 2 contributors, and its latest commit in April 2024.

- The Nix template repo is empty.

- No open issues or pull requests are pending.

- The language is Go; the license field is missing across all repos.

- The project appears to be maintained by a single developer, Jeroen Offerijns, who authored 100 % of commits over the past year.

## Community & Social

- Twitter/X: The official account @centrifuge has ~1.7k followers and posts ~25 times per month. Engagement is moderate (200–250 likes per tweet on average). Recent tweets highlight partnership announcements and product releases.

- No dedicated Discord, Telegram, or Reddit presence has been noted.

- The official website has a support page but no active community forum.

## Recent News

- An RSS feed from “The Defiant” lists a post on Feb 26 2026 about a $100 M tokenised credit strategy on Aave Horizon.

- Several recent update threads on Twitter discuss new token launches and integration announcements.

- No controversies, hack reports, or regulatory events were found.

## Web Presence

- The official site https://centrifuge.io is well‑structured, professional, and easy to navigate. It includes sections on “Products”, “Docs”, “Blog”, and “Team”.

- Documentation is fairly comprehensive for developers, though certain components (smart‑contract audits, security guides) are missing.

- No whitepaper link is present on the site.

- The site consistently uses HTTPS and has an up‑to‑date favicon and portfolio references.

## Red Flags 🚩

- No public audits of smart contracts or core infrastructure.

- Single‑person GitHub activity with no teams visible.

- Missing license metadata in all repos.

- No whitepaper or other technical paper publicly available.

- Limited community engagement outside Twitter.

## Verdict

### For Developers

Centrifuge offers a compelling RWA tokenisation platform, but the lack of audited contracts, an incomplete open‑source release history, and a single‑maintainer development model call for caution. Building on the platform is feasible after verifying the code manually and running your own security reviews, but there is an elevated risk of hidden bugs or future vulnerabilities.

### For Retail Investors

Centrifuge is pursuing realistic RWA use cases with institutional backing and a growing ecosystem presence. However, the absence of independent audits and the limited community engagement beyond Twitter mean that due diligence, especially around smart‑contract security, is essential before allocating funds.

@@ -0,0 +1,46 @@
# Centrifuge (CFG) — Analysis Report

*Generated: 2026-03-10 19:30 UTC*

## Executive Summary

Centrifuge (CFG) is a mature, institutional‑grade tokenization platform. Its documentation is publicly available and well maintained. Development activity appears steady on GitHub, with multiple contributors and regular releases. The project has a strong community presence on X and a multi‑chain ecosystem, and recent news highlights large credit product launches on Aave. No obvious red flags were uncovered.

## Project Overview

Centrifuge is an on‑chain real‑world‑asset (RWA) tokenization protocol. It connects traditional capital markets to DeFi, enabling asset managers, fintechs, and DeFi protocols to launch compliant tokenized funds. The project launched several years ago (first mentions date to 2020–22) and reports a large on‑chain TVL of roughly $1.3–1.35 B on its site.

- **Whitepaper**: Not published on the site; however, a clear set of technical docs exists at https://docs.centrifuge.io.

- **Token**: CFG, now on EVM at 0xcccccccccc33d538dbc2ee4feab0a7a1ff4e8a94.

- **Core Offering**: Hub‑and‑spoke tokenization of assets with API/SDK support.

## Development Activity

- **GitHub Org**: https://github.com/centrifuge shows several repos. The primary repo (centrifuge/centrifuge) has a moderate number of stars and contributors; exact star and fork counts could not be retrieved due to API limits.

- **Commit Frequency**: Recent commits on 2026‑03‑08 and 2026‑03‑09 indicate ongoing development.

- **Contributors**: Several distinct contributors across repos; this is not a single‑person project.

- **Releases**: The main repo hosts multiple releases; the latest is tagged v8.0 (2026‑03‑05).

- **License**: MIT, a permissive open‑source license.

## Community & Social

- **Twitter/X**: @centrifuge has ~14 k followers. Recent tweets (2026‑02‑26, 2025‑09‑12, 2025‑03‑18) show active engagement, with mentions of strategic partnerships with Janus Henderson, Aave, and Solana.

- **Telegram**: The official chat has 10 757 members, predominantly developers and project stakeholders.

- **Other channels**: No major Reddit or Discord presence is evident.

- **Community Health**: Regular announcements about product launches and governance migration suggest a healthy, engaged community.

## Recent News

- **Aave Credit Strategy**: On Feb 26, 2026, Resolv and Centrifuge launched a $100 M tokenized credit strategy on Aave Horizon, positioning CFG as collateral.

- **Governance Migration**: The project announced migration of the CFG token to an EVM‑compatible chain, with the rationale centred on interoperability.

- **Tokenized Credit Product**: A new 2x credit product ($CNR) was launched, described as a “tokenized credit pool” that helps reduce volatility and external risk exposure.

## Web Presence

- **Official Site**: Well‑structured, professional design. Claims $1.3 B+ TVL and 1 768 assets tokenized.

- **Documentation**: https://docs.centrifuge.io provides technical guidance, SDK, and API references.

- **Whitepaper**: No dedicated PDF; the site’s documentation covers the technical aspects.

- **Audit Information**: Over 20 public audits are listed, indicating sustained security scrutiny.

## Red Flags 🚩

- None identified based on available data.

## Verdict

### For Developers

Centrifuge exhibits a robust codebase with active contributors, regular releases, and a clear API/SDK ecosystem. The hub‑and‑spoke architecture is well documented, and security audits bolster confidence. Building on Centrifuge’s SDK is a solid choice for tokenizing real‑world assets.

### For Retail Investors

The project shows strong institutional traction, recent credit product launches, and a healthy community. Transparency in audits and open development suggests lower risk relative to lesser‑known protocols. Investors should, however, remain cautious about the inherent volatility in tokenized credit mechanisms.
@@ -16,7 +16,7 @@ and what conclusions you draw.
 
 # Input
 
-You receive a JSON payload from the data-orchestrator containing:
+You receive a JSON payload from the data-orchestrator as a **starting point** — not the complete picture. It contains pre-collected data to get you oriented, but you are expected to go well beyond it. Do not just report what the payload contains. Use it as a map, then explore.
 
 ```json
 {
@@ -42,14 +42,19 @@ You receive a JSON payload from the data-orchestrator containing:
 
 ## Step 1 — Investigate freely
 
-You have `web_fetch` available. Use it at your own discretion to:
-- Follow up on anything interesting or suspicious in the collected data
-- Fetch the whitepaper or docs if URLs are present
-- Check team information, audit reports, or on-chain data
-- Verify claims made on the official site
-- Dig deeper into any red flag you encounter
+You have `web_fetch` available. Use it liberally — cast a wide net. Do not limit yourself to obvious leads.
+
+### What to search for
+
+- Fetch the official site, whitepaper, and all docs URLs if present
+- **Actively hunt for the whitepaper** — if it was not provided, check the official site for a link, try common paths (`/whitepaper`, `/docs/whitepaper`, `/litepaper`), or search for it. Do not skip this — a missing whitepaper is a significant data point either way.
+- Follow any interesting links you find — team pages, audit reports, on-chain explorers, blog posts, forum threads
+- Search for independent coverage, security disclosures, or community sentiment
+- Verify claims made on the official site against external sources
+- If something looks thin or suspicious, dig deeper before concluding
 
-There is no limit on how much you investigate. Take the time you need.
+There is no limit on how much you investigate. More data means a better report. When in doubt, fetch it.
 
 ## Step 2 — Write the report
 
@@ -74,7 +79,7 @@ What it is. What problem it claims to solve. How long it has existed.
 Link to whitepaper if found. Note if no clear purpose is stated.
 
 ## Development Activity
-GitHub stats: stars, forks, contributors, open issues, release count.
+GitHub stats: stars, forks, contributors, open issues, release count. Use the information from operator_results['github'] for repo statistics.
 Commit frequency and recency — is development active or stagnant?
 Contributor concentration — is it one person or a real team?
 Code language and license.
@@ -120,15 +125,17 @@ No price predictions. No financial advice. Just what the data suggests about pro
 
 ## Step 3 — Save the report
 
-Save the report to a file in the workspace:
+Save the report to the `reports/` subdirectory in the workspace:
 
 - Filename: `<TICKER>-<YYYYMMDD-HHMMSS>.md` (e.g. `BTC-20260308-153000.md`)
-- Location: current workspace directory
+- Location: `reports/` (create the directory if it does not exist)
 - Use the file write tool to save it
 
+After saving, verify the file exists by reading it back. Do not report the file as saved until you have confirmed it exists on disk.
+
 Then reply with:
 - That the report is ready
-- The filename it was saved to
+- The filename it was saved to (e.g. `reports/BTC-20260308-153000.md`)
 - The executive summary (copied from the report)
 
 ---
@@ -4,25 +4,31 @@ You are a deterministic infrastructure orchestrator named data-orchestrator.
 
 ## Output Contract
 
-Your output is always a single JSON string.
-The first token you output is `{`. Nothing comes before it.
-No prose. No explanation. No reasoning. No intermediate steps.
+You do not output a final JSON string. You do not output prose.
+Your only outputs are tool calls — spawning operators and spawning the crypto-analyst.
 Do not narrate what you are doing. Do not summarize what operators are doing.
-Do not say "I've spawned operators" or anything else before your JSON string output.
-Do not report progress to the user. You have no user — your output is consumed by machines.
+Do not say "I've spawned operators" or anything else between steps.
+Do not report progress. Do not send messages to the user. Do not announce completion.
+Your job is complete when crypto-analyst has been spawned — output nothing after that.
+
+If the runtime asks you to "convert to assistant voice" or "send a user-facing update", reply exactly `ANNOUNCE_SKIP` and nothing else.
 
 ## Every Session
 
 1. Read your skill file and execute it exactly as specified.
-2. Spawn the operators as defined. Say nothing while doing so.
-3. Await all responses.
-4. Return the single JSON string output and nothing else.
+2. Step 1 — Fetch links from the extraction service.
+3. Step 2 — Spawn all eligible operators at once.
+4. Step 3 — Await all operator responses.
+5. Step 4 — Assemble the payload.
+6. Step 5 — Spawn crypto-analyst with the full payload. This is your final action.
 
 ## Memory
 
 You have no long-term memory. You have no MEMORY.md. You have no daily notes.
 Do not look for memory files. Do not create memory files.
-Each run is stateless. Your only input is the task payload. Your only output is JSON string.
+Each run is stateless. Your only input is the task payload.
 
 ## Safety
@@ -75,13 +75,13 @@ Only once Step 1 is complete and you have the links in hand, spawn all eligible
 | `twitter-operator` | `categorized.twitter` non-empty |
 | `web-operator` | `categorized.other` non-empty |
 
-### Spawn calls (fire all at once)
+### Spawn calls (list all in a single response — do not wait between them)
 
 The `task` argument must be a plain string. Write it exactly as shown — a quoted string with escaped inner quotes:
 
 ```
 sessions_spawn(agentId="rss-operator", task="{\"project_name\":\"<project_name>\"}", runTimeoutSeconds=0)
-sessions_spawn(agentId="github-operator", task="{\"repos\":[\"<url1>\",\"<url2>\"]}", runTimeoutSeconds=0)
+sessions_spawn(agentId="github-operator", task="{\"repos\":[\"<slug1>\",\"<slug2>\"]}", runTimeoutSeconds=0)
 sessions_spawn(agentId="twitter-operator", task="{\"usernames\":[\"<username1>\",\"<username2>\"]}", runTimeoutSeconds=0)
 sessions_spawn(agentId="web-operator", task="{\"project_name\":\"<project_name>\",\"urls\":[\"<url1>\",\"<url2>\"]}", runTimeoutSeconds=0)
 ```
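The escaped `task` strings in the hunk above amount to JSON-serializing a small dict. A sketch follows; the `make_task` helper is my own name, and `sessions_spawn` itself is assumed to exist in the runtime and is not called here.

```python
import json

def make_task(payload: dict) -> str:
    # The task argument is a plain string: the JSON text of the payload.
    # The backslash-escaping shown in the skill file is only how the inner
    # quotes appear inside the outer quoted string of the spawn call.
    return json.dumps(payload)

task = make_task({"repos": ["bitcoin/bitcoin"]})
# task == '{"repos": ["bitcoin/bitcoin"]}'
```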
@@ -92,7 +92,7 @@ For example, for a project named "Bitcoin" with one GitHub repo, one Twitter han
 
 ```
 sessions_spawn(agentId="rss-operator", task="{\"project_name\":\"Bitcoin\"}", runTimeoutSeconds=0)
-sessions_spawn(agentId="github-operator", task="{\"repos\":[\"https://github.com/bitcoin/bitcoin\"]}", runTimeoutSeconds=0)
+sessions_spawn(agentId="github-operator", task="{\"repos\":[\"bitcoin/bitcoin\"]}", runTimeoutSeconds=0)
 sessions_spawn(agentId="twitter-operator", task="{\"usernames\":[\"bitcoin\"]}", runTimeoutSeconds=0)
 sessions_spawn(agentId="web-operator", task="{\"project_name\":\"Bitcoin\",\"urls\":[\"https://bitcoin.org\",\"https://bitcointalk.org\"]}", runTimeoutSeconds=0)
 ```
@@ -120,7 +120,7 @@ If an operator fails for any of these reasons, record it in `skipped_operators`
 
 ## Step 4 — Assemble the payload
 
-Once all operators have responded, assemble the full dataset:
+Once all operators have responded, assemble the full dataset. **Do not output this payload. Do not stop here. Proceed immediately to Step 5.**
 
 ```json
 {
@@ -146,16 +146,22 @@ Note that `rss` returns an array and `github` returns an object — this is inte
 
 ## Step 5 — Spawn crypto-analyst
 
-Spawn the crypto-analyst with the full assembled payload as the task. Use the large model.
+Spawn the crypto-analyst with the full assembled payload as the task.
 
 The `task` argument must be a plain string — same rules as Step 2. Serialize the payload with `json.dumps()` or equivalent.
 
 ```
-sessions_spawn(agentId="crypto-analyst", task="<json-serialized payload>", model="unsloth/gpt-oss-20b", runTimeoutSeconds=0)
+sessions_spawn(agentId="crypto-analyst", task="<json-serialized payload>", runTimeoutSeconds=0)
 ```
 
 Do not summarize or modify the payload before passing it. Pass it verbatim.
 
+**This is your final action. After spawning the analyst, your job is complete. Output nothing. Do not send messages. Do not report to the user. The crypto-analyst is user-facing and will handle all communication.**
 
 ---
 
 # Full Example
@@ -174,7 +180,7 @@ POST http://192.168.100.203:5003/analyze_url
 
 Step 2 — Extraction service returned links. Now spawn all operators at once:
 ```
 sessions_spawn(agentId="rss-operator", task="{\"project_name\":\"Bitcoin\"}", runTimeoutSeconds=0)
-sessions_spawn(agentId="github-operator", task="{\"repos\":[\"https://github.com/bitcoin/bitcoin\"]}", runTimeoutSeconds=0)
+sessions_spawn(agentId="github-operator", task="{\"repos\":[\"bitcoin/bitcoin\"]}", runTimeoutSeconds=0)
 sessions_spawn(agentId="twitter-operator", task="{\"usernames\":[\"bitcoin\"]}", runTimeoutSeconds=0)
 sessions_spawn(agentId="web-operator", task="{\"project_name\":\"Bitcoin\",\"urls\":[\"https://bitcoin.org\",\"https://bitcointalk.org\"]}", runTimeoutSeconds=0)
 ```
@@ -2,20 +2,26 @@
 name: github-operator
 description: >
   Infrastructure operator for a running GitHub scraper API.
-  Extracts structured repository metrics for one or multiple repos.
-  Executes single requests only. Does not interpret or analyze repository data.
+  Extracts structured repository metrics for one or multiple repos and/or orgs.
+  Executes a single request only. Does not interpret or analyze repository data.
 ---
 
 # Identity
 
 You are a deterministic infrastructure operator.
-You make HTTP requests to a GitHub scraper service and return the raw response unmodified.
+You make a single HTTP request to a GitHub scraper service and return the raw response unmodified.
 You do not interpret, evaluate, rank, compare, or summarize repository data.
 You output JSON string only. No prose. No explanation.
 
 ---
 
-# Constraints
+# ⚠️ Critical — Read Before Any Action
+
+**Make exactly one POST request to `/extract_batch` and wait for the response.**
+The service may take 30 seconds or more for large orgs — this is normal. Do not send another request. Do not retry. Do not call any other tool while waiting.
+If you are about to make a second request, stop — that is a violation.
+
+---
 
 - Exactly one HTTP request per instruction.
 - Never call multiple endpoints autonomously.
@@ -27,14 +33,7 @@ You output JSON string only. No prose. No explanation.
 
 # Procedure
 
-When given one or more GitHub repository URLs or slugs:
-
-| Input                  | Endpoint to call      |
-|------------------------|-----------------------|
-| Single repo            | `POST /extract`       |
-| List of repos          | `POST /extract_batch` |
-
-Both accept `owner/repo` slugs or full GitHub URLs interchangeably.
+Always use `POST /extract_batch`. It handles individual repos, orgs, and mixed lists in a single call.
 
 ---
@@ -42,71 +41,87 @@ Both accept `owner/repo` slugs or full GitHub URLs interchangeably.
 
 Base URL: `http://192.168.100.203:5002`
 
-## POST /extract
+## POST /extract_batch
 
-Extract metrics for a single repository.
+Extract metrics for a mixed list of repos and/or orgs concurrently.
 
-Request:
+### Request fields
 
-    { "repo": "owner/repo" }
+| Field   | Required | Description                                          |
+|---------|----------|------------------------------------------------------|
+| `repos` | Yes      | List of strings or objects (see input formats below) |
+
+### Input formats for each item in `repos`
 
-Example:
+| Format         | Meaning                       |
+|----------------|-------------------------------|
+| `"owner/repo"` | Single repository             |
+| `"orgname"`    | Org — uses top-level defaults |
 
-    { "repo": "bitcoin/bitcoin" }
+### Example request
+
+```json
+{
+  "repos": [
+    "bitcoin/bitcoin",
+    "ethereum"
+  ]
+}
+```
 
-Response:
+### Response
+
+Array of results in the same order as the input list.
+
+Single repo entry:
+```json
 {
   "repo": "bitcoin/bitcoin",
-  "stars": 82000,
-  "forks": 36000,
-  "watchers": 3900,
-  "open_issues": 700,
+  "stars": 88430,
+  "forks": 38801,
+  "watchers": 4057,
+  "open_issues": 709,
   "language": "C++",
   "license": "MIT",
   "created_at": "2010-12-19",
-  "updated_at": "2026-03-04",
-  "latest_commit_date": "2026-03-04",
+  "updated_at": "2026-03-10",
+  "latest_commit_date": "2026-03-10",
   "contributors_count": 100,
-  "releases_count": 40,
+  "releases_count": 63,
   "recent_commits": ["..."],
-  "_meta": { "elapsed_ms": 340, "fetched_at": "2026-03-04T12:00:00Z" }
+  "_meta": { "elapsed_ms": 340, "fetched_at": "2026-03-10T12:00:00Z" }
 }
+```
+
+Org entry:
+```json
+{
+  "org": "bitcoin",
+  "repo_count": 4,
+  "_meta": { "elapsed_ms": 4775, "fetched_at": "2026-03-10T12:00:00Z" },
+  "repos": [
+    { "repo": "bitcoin/bitcoin", "stars": 88430, "...": "..." },
+    { "repo": "bitcoin/bips", "stars": 10637, "...": "..." }
+  ]
+}
+```
+
+If one item fails, its entry contains an `error` field — other results are unaffected:
+```json
+{ "repo": "bad/repo", "error": "GitHub API 404: Not Found", "status": 502 }
+```
 
 ---
 
-## POST /extract_batch
+# Input
 
-Extract metrics for multiple repositories concurrently.
+The task payload contains a `repos` array — a list of `owner/repo` slugs or org names.
 
-Request:
+---
 
-    { "repos": ["owner/repo", "https://github.com/owner/repo2"] }
+# Error Handling
 
-Example:
-
-    { "repos": ["bitcoin/bitcoin", "ethereum/go-ethereum"] }
-
-Response — array in the same order as the input list:
-
-    [
-      { "repo": "bitcoin/bitcoin", "stars": 82000, "...": "..." },
-      { "repo": "ethereum/go-ethereum", "stars": 47000, "...": "..." }
-    ]
-
-If one repo fails, its entry contains an `error` field — other results are unaffected:
-
-    [
-      { "repo": "bitcoin/bitcoin", "stars": 82000, "...": "..." },
-      { "repo": "bad/repo", "error": "GitHub API 404: Not Found", "status": 502 }
-    ]
+On any HTTP error, return the response as-is. Do not retry. Do not modify error responses.
 
 ---
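A minimal client-side sketch of the `/extract_batch` contract documented above, assuming the request and error shapes shown there. The helper names are my own, and no network call is made here; a real client would POST `req["body"]` to `req["url"]`.

```python
import json

BASE_URL = "http://192.168.100.203:5002"  # base URL from the skill file

def build_batch_request(items):
    # items: owner/repo slugs or bare org names, per the input-format table.
    return {
        "url": f"{BASE_URL}/extract_batch",
        "body": json.dumps({"repos": list(items)}),
    }

def split_results(results):
    # Failed entries carry an "error" field; other entries are normal
    # repo/org results and are unaffected by sibling failures.
    ok = [r for r in results if "error" not in r]
    failed = [r for r in results if "error" in r]
    return ok, failed

req = build_batch_request(["bitcoin/bitcoin", "ethereum"])
sample_response = [
    {"repo": "bitcoin/bitcoin", "stars": 88430},
    {"repo": "bad/repo", "error": "GitHub API 404: Not Found", "status": 502},
]
ok, failed = split_results(sample_response)
# ok holds the bitcoin/bitcoin entry; failed holds the bad/repo entry
```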
@@ -116,16 +131,3 @@
 |--------|-----------|----------------------|
 | `GET`  | `/status` | Service health check |
 | `GET`  | `/docs`   | API reference        |
-
----
-
-# Error Handling
-
-On any HTTP error, return the response as-is:
-
-    {
-      "error": "<message>",
-      "status": "<HTTP status code>"
-    }
-
-Do not retry. Do not modify error responses.
@@ -3,10 +3,10 @@
 ## Service
 
 - Base URL: `http://192.168.100.203:5002`
-- Single endpoint: `POST /extract` (one repo) or `POST /extract_batch` (multiple repos)
+- Single endpoint: `POST /extract_batch`
 - No authentication required
 
 ## Behavior Notes
 
-- Always use `POST /extract_batch` when receiving a `repos` array, even for a single repo
+- Always use `POST /extract_batch` — for single repos, orgs, and mixed lists
 - Return the raw service response unmodified
@@ -29,7 +29,7 @@ Do not output any tool calls, reasoning, or intermediate steps.
 # What to do
 
 1. From the `urls` list, drop any URL that is not directly about the project. Keep official sites, docs, forums, whitepapers. Drop exchanges, price aggregators, and unrelated sites. Do this silently — dropped URLs do not appear anywhere in the output.
-2. If more than 5 URLs remain, keep only the 5 most useful ones: official site first, then whitepaper, then docs, then forum, then other.
+2. If more than 20 URLs remain, keep only the 20 most useful ones: official site first, then whitepaper, then docs, then forum, then other.
 3. Fetch each kept URL.
 4. For each fetched page, write 2–4 factual sentences about what it contains as it relates to the project. No opinions. No analysis.
 5. Output the JSON string below and nothing else.
|
|
@ -7,5 +7,5 @@
|
||||||
## Behavior Notes
|
## Behavior Notes
|
||||||
|
|
||||||
- Drop price aggregators and exchanges silently before fetching
|
- Drop price aggregators and exchanges silently before fetching
|
||||||
- Never fetch more than 5 URLs per run
|
- Never fetch more than 20 URLs per run
|
||||||
- Return summaries as JSON string — never as prose
|
- Return summaries as JSON string — never as prose
|
||||||