Instructions for Testing
Onboarding Instructions for Building DeFi Agents
Welcome to the alpha testing program for building DeFi AI pipelines!
Your mission is to develop a working DeFi pipeline that delivers functional insights for one of two key areas: On-Chain Data Analysis or Market Analysis & Prediction.
Use the guidelines below to plan, build, and submit your pipeline.
1. Choose Your Use Case
You must choose one of the following domains for your pipeline:
On-Chain Data Analysis
• Focus on extracting, processing, and summarizing data directly from blockchain networks.
• Potential functionalities include transaction trend analysis, protocol usage stats, and anomaly detection.
Market Analysis & Prediction
• Focus on aggregating market data (prices, volumes, social sentiment) and using that data to forecast trends.
• Ensure your pipeline includes clear reasoning behind any predictive analytics or recommendations (a minimal illustration follows the note below).
Note: While your primary focus should be on one domain, you are welcome to include secondary features if they add value to the end user.
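For the Market Analysis & Prediction track, "clear reasoning" can be as lightweight as emitting the rule behind each signal alongside the signal itself. Below is a minimal Python sketch using a moving-average crossover; the window sizes and the `trend_signal` naming are illustrative assumptions, not recommendations:

```python
def sma(prices: list[float], window: int) -> float:
    """Simple moving average over the most recent `window` prices."""
    if len(prices) < window:
        raise ValueError("not enough price history")
    return sum(prices[-window:]) / window

def trend_signal(prices: list[float]) -> dict:
    """Return a signal together with the rule that produced it,
    so the output is auditable rather than a black box."""
    fast, slow = sma(prices, 12), sma(prices, 48)
    return {
        "signal": "bullish" if fast > slow else "bearish",
        "reasoning": (
            f"12-period SMA ({fast:.2f}) is "
            f"{'above' if fast > slow else 'below'} "
            f"the 48-period SMA ({slow:.2f}); crossover heuristic only."
        ),
    }
```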
2. Data Sources and Integration
Required Data Source Categories:
On-Chain Data: Use blockchain explorers, subgraphs, or other on-chain APIs.
Off-Chain Data: Leverage price aggregators, social media sentiment analysis tools, or other market data sources.
Guidelines:
Reference Material: Detailed dataset examples and API documentation are provided in your resource bundle. Use these as a starting point, but feel free to integrate additional or alternative sources that meet your project’s objectives.
Data Reliability: Verify that the data you retrieve is accurate and up-to-date. Your pipeline should gracefully handle any potential data delays or retrieval failures.
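One way to satisfy the reliability guideline is retry-with-backoff plus an explicit fallback, so a slow or failed source degrades the output instead of crashing the run. A minimal sketch using the `requests` library (the URL is a placeholder):

```python
import time
import requests

def fetch_with_retry(url: str, retries: int = 3, backoff: float = 1.0):
    """Fetch JSON from `url`, retrying with exponential backoff.
    Returns None instead of raising so the caller can fall back
    to cached or stale data."""
    for attempt in range(retries):
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt < retries - 1:
                time.sleep(backoff * 2 ** attempt)  # 1s, 2s, 4s, ...
    return None  # caller decides: use cache, skip this source, or alert
```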
3. Output Requirements
General Expectations:
Actionable Insights: Each pipeline must output clear, concise summaries or predictions that can inform real-world DeFi decisions.
Clarity in Explanations: For pipelines that include predictive models, document the logic and reasoning behind your predictions. This could take the form of inline comments, a separate README file, or a detailed summary report.
Format:
• Ensure that your outputs are easy to read and interpret. Use tables, charts, or bullet points where applicable.
• Consider providing both machine-readable (e.g., JSON, CSV) and human-readable formats.
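Emitting both formats can be as simple as writing the same records to JSON for machines and an aligned text table for humans. A minimal sketch (the field names are illustrative):

```python
import json

insights = [
    {"token": "ETH", "metric": "24h_volume_usd", "value": 1.2e9},
    {"token": "UNI", "metric": "24h_volume_usd", "value": 8.4e7},
]

# Machine-readable: stable keys, raw values.
with open("insights.json", "w") as f:
    json.dump(insights, f, indent=2)

# Human-readable: aligned columns, formatted numbers.
print(f"{'Token':<8}{'Metric':<20}{'Value':>16}")
for row in insights:
    print(f"{row['token']:<8}{row['metric']:<20}{row['value']:>16,.0f}")
```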
4. Development Objectives
Accuracy & Reliability:
Robust Data Handling: Implement error-checking and fallback mechanisms to minimize failures or delays.
Testing: Validate your pipeline with different data sets and scenarios to ensure consistent performance.
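Scenario testing does not require heavy tooling; pytest-style unit tests over representative and edge-case inputs already catch most regressions. A minimal sketch, where `mypipeline` and `normalize_prices` are hypothetical names standing in for your own module:

```python
# test_pipeline.py -- run with `pytest`
import pytest
from mypipeline import normalize_prices  # hypothetical module and function

def test_typical_data():
    assert normalize_prices([100.0, 110.0]) == [1.0, 1.1]

def test_empty_input_is_handled():
    assert normalize_prices([]) == []

def test_invalid_price_raises():
    with pytest.raises(ValueError):
        normalize_prices([100.0, -5.0])  # negative prices are invalid
```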
Scalability:
Concurrency: Design your pipeline to handle high volumes of data and multiple concurrent queries (see the sketch below).
Performance: Optimize data fetching and processing to support real-time or near-real-time insights.
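One common concurrency pattern in Python is `asyncio` with `aiohttp`, which issues requests in parallel instead of sequentially. A minimal sketch (the URLs are placeholders):

```python
import asyncio
import aiohttp

async def fetch_json(session: aiohttp.ClientSession, url: str):
    async with session.get(url, timeout=aiohttp.ClientTimeout(total=10)) as resp:
        resp.raise_for_status()
        return await resp.json()

async def fetch_all(urls: list[str]):
    """Fetch every URL concurrently; a failure surfaces as an
    exception object in the results instead of aborting the batch."""
    async with aiohttp.ClientSession() as session:
        tasks = [fetch_json(session, url) for url in urls]
        return await asyncio.gather(*tasks, return_exceptions=True)

results = asyncio.run(fetch_all([
    "https://example.com/api/prices",   # placeholder endpoints
    "https://example.com/api/volumes",
]))
```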
Innovation:
Beyond the Basics: Don’t feel restricted to the provided examples. Explore novel approaches that solve real challenges for DeFi users.
User-Centric Solutions: Focus on features that improve decision-making for end users in a tangible way.
5. Submission and Feedback
Documentation: Include a clear README file that outlines your chosen use case, data sources, processing logic, and how to run your pipeline.
Demonstration: Provide sample outputs along with instructions on how to test or interact with your pipeline.
Feedback Loop: Be prepared to iterate on your design based on tester and developer feedback. Document any issues you encounter and your proposed fixes.
6. Additional Tips for Success
Plan Before You Code: Outline your design, identify necessary data endpoints, and draft the structure of your pipeline.
Keep It Modular: Build your solution in modular components to simplify debugging and future improvements (see the sketch after this list).
Communicate Clearly: Use comments and documentation to make your code and logic easy for others to understand.
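As a sketch of what "modular" can mean in practice, the pipeline below separates retrieval, transformation, and reporting so each stage can be tested or swapped independently (all names and data are illustrative):

```python
def fetch() -> list[dict]:
    """Stage 1: retrieval only. Stubbed with static rows here;
    swap in a real API call without touching the other stages."""
    return [{"token": "ETH", "price": 3000.0}, {"token": "UNI", "price": 7.5}]

def transform(rows: list[dict]) -> list[dict]:
    """Stage 2: cleaning, joins, and derived metrics."""
    total = sum(r["price"] for r in rows)
    return [{**r, "share": r["price"] / total} for r in rows]

def report(rows: list[dict]) -> str:
    """Stage 3: human-readable rendering."""
    return "\n".join(f"{r['token']}: {r['share']:.1%} of basket" for r in rows)

if __name__ == "__main__":
    print(report(transform(fetch())))
```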
By following these detailed steps, you will help ensure that your DeFi pipeline is not only innovative and robust but also user-friendly and aligned with our program goals.
Thank you for your contributions, and happy testing!
Data Sources (References for Testers)
The following data sources are examples you can use to develop pipelines. They are reference suggestions, not mandatory requirements; feel free to explore additional sources.
A. On-Chain & Subgraph Data (Data Retrieval)
Block Explorers: Etherscan, BSCScan, Polygonscan.
Raw Blockchain Interactions: Token transfers, contract calls, gas fees, approvals (ERC-20, ERC-721, ERC-1155).
Whale Activity: Large wallet inflows/outflows.
Governance Activity: DAO proposals, voting records.
Historical Metrics: Gas fees and transaction latency under varying network conditions.
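As one concrete retrieval example, Etherscan exposes a REST API for per-address transaction history. The sketch below shows the commonly used `account`/`txlist` query; it assumes you have registered for a free API key, and the exact endpoint and parameters should be confirmed against the current Etherscan documentation:

```python
import requests

API_KEY = "YOUR_ETHERSCAN_KEY"  # assumption: a registered Etherscan key
ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder address

params = {
    "module": "account",
    "action": "txlist",       # normal transactions for an address
    "address": ADDRESS,
    "startblock": 0,
    "endblock": 99999999,
    "sort": "desc",
    "apikey": API_KEY,
}
resp = requests.get("https://api.etherscan.io/api", params=params, timeout=10)
txs = resp.json().get("result", [])
print(f"Fetched {len(txs)} transactions")
```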
B. DeFi Analytics & Insights (Market & Price Data)
Price Aggregators: CoinGecko, CoinMarketCap, Binance API for real-time and historical token prices.
DeFi Protocol Data: APR/APY rates for lending, staking, and liquidity pools (e.g., Aave, Compound, Curve).
Market Sentiment Indicators: Social media sentiment, funding rates from perpetual swaps.
Derivatives Data: Funding rates, open interest, and options pricing for derivative and perpetual markets.
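For price data, CoinGecko's public `/simple/price` endpoint is a common keyless starting point (it is rate-limited; verify parameters and limits against the current CoinGecko documentation):

```python
import requests

resp = requests.get(
    "https://api.coingecko.com/api/v3/simple/price",
    params={"ids": "ethereum,uniswap", "vs_currencies": "usd"},
    timeout=10,
)
prices = resp.json()  # e.g. {"ethereum": {"usd": ...}, "uniswap": {"usd": ...}}
for token, quote in prices.items():
    print(f"{token}: ${quote['usd']:,}")
```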
C. Multi-Source Data Merging (Large-Scale DeFi Transaction Records)
Liquidity Pool Metrics: Reserve balances, TVL (Total Value Locked) metrics (e.g., Uniswap, Balancer, Sushiswap).
Yield-Farming Data: Staking rewards and yield variations over different timeframes.
Flash Loan Data: Availability, utilization, and protocol-specific lending rates.
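Merging on-chain records with off-chain prices often reduces to a timestamp join. A minimal pandas sketch, assuming both sources have already been normalized into DataFrames (column names and values are illustrative):

```python
import pandas as pd

# On-chain: hourly pool reserve snapshots (e.g., from a subgraph).
reserves = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 01:00"]),
    "pool": ["ETH/USDC", "ETH/USDC"],
    "eth_reserve": [5200.0, 5150.0],
})

# Off-chain: hourly ETH prices from a price aggregator.
prices = pd.DataFrame({
    "ts": pd.to_datetime(["2024-01-01 00:00", "2024-01-01 01:00"]),
    "eth_usd": [2300.0, 2315.0],
})

merged = reserves.merge(prices, on="ts", how="left")
merged["reserve_usd"] = merged["eth_reserve"] * merged["eth_usd"]
print(merged[["ts", "pool", "reserve_usd"]])
```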
D. Scalability & Efficiency (Cross-Chain Activity & Bridging Data)
Bridge-Specific Data: Transaction history, fees, slippage, and liquidity analytics (e.g., LayerZero, Stargate, Hop Protocol).
Real-Time Monitoring: Decentralized bridge reserves and pool imbalances.
Reference Data Sources
The following platforms provide rich datasets for building DeFi pipelines. These are just examples; you can explore and integrate additional sources as needed:
L2Beat: Scaling solutions, metrics for Layer 2 protocols.
Artemis: Multi-chain analytics, TVL, and user activity metrics.
DeBank: Wallet tracking, DeFi portfolio aggregation.
CoinGlass: Market indicators, funding rates, derivatives data.
Santiment: Social sentiment, on-chain activity, and market metrics.
Token Terminal: Protocol financial metrics, revenues, and user data.
Nansen: On-chain analytics, wallet tracking, whale activity.
Messari: Crypto reports, research, and market data.
DeFiLlama: TVL metrics, protocol comparisons, multi-chain data.
CryptoQuant: Exchange flows, miner data, market indicators.
Dune Analytics: Custom dashboards and on-chain data analysis.
Basic Passing Criteria
A pipeline will PASS if it meets the following requirements:
1. Functionality & Execution
Data Retrieval: The pipeline can successfully fetch data from on-chain and off-chain sources, such as blockchain explorers, DEX aggregators, and yield protocols.
Data Processing: The pipeline processes the retrieved data into actionable insights or outputs.
Examples include:
Token trend analysis.
Yield/APY calculations (a worked example follows this list).
Price predictions or market sentiment analysis.
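As a worked example of a yield calculation, converting a quoted APR to an effective APY depends on the compounding frequency: APY = (1 + APR/n)^n − 1 for n compounding periods per year. A minimal sketch:

```python
def apr_to_apy(apr: float, periods_per_year: int) -> float:
    """Effective APY for a nominal APR compounded n times per year."""
    return (1 + apr / periods_per_year) ** periods_per_year - 1

# A 5% APR compounded daily is roughly a 5.13% APY.
print(f"{apr_to_apy(0.05, 365):.4%}")  # -> 5.1267%
```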
End-to-End Workflow: The pipeline can execute its end-to-end workflow without crashing, including data retrieval, processing, and output generation.
2. Data Accuracy & Relevance
Correct Data Outputs: The pipeline delivers accurate and logically structured outputs based on the retrieved data.
Examples include:
Accurate token prices or trading volumes.
Reliable APY projections or arbitrage opportunities.
Relevant responses to user queries in prompt pipelines.
Relevance: The pipeline outputs align with the specific use case (e.g., on-chain data analysis or market analysis & prediction).
3. Stability & Scalability
The pipeline runs reliably without frequent errors or crashes.
It can handle standard workloads (e.g., retrieving and processing data for common DeFi use cases).
The pipeline can adapt to various execution environments (e.g., local, containerized, or cloud-based setups).
4. Documentation
As the tester, you should provide clear documentation of the pipeline, including:
Use case description.
Data sources used.
Pipeline workflow.
Expected outputs.
Test cases or example queries to validate functionality.