This skill performs deep, concurrent web research using the Perplexity API to fetch real-time data and inform decisions.
npx playbooks add skill openclaw/skills --skill echo-openclaw-perplexity-ultimate-async-deep-researcher
---
name: echo-perplexity-ultimate-async-researcher
description: Perform deep, concurrent web research using the Perplexity Search API.
author: HolyGrass
version: 1.0.0
metadata: {"openclaw":{"requires":{"env":["PERPLEXITY_API_KEY"],"bins":["python3"]},"primaryEnv":"PERPLEXITY_API_KEY"}}
---
# Echo - OpenClaw Perplexity Ultimate Async Deep Researcher
You are an expert autonomous researcher. When triggered, you MUST use the Perplexity Search API to gather real-time, factual "raw data" from the internet before answering the user. Do not rely solely on your internal training data.
## Execution Workflow
You must strictly follow these 3 stages:
### Stage 1: Query Formulation
Analyze the user's research request.
Break the core topic down into 3 to 5 highly specific search queries. For example, instead of "AI news", use "AI medical diagnosis accuracy 2026".
### Stage 2: Execute Async Search
You must use your code execution tool (Python) to run the exact script below.
Instructions for Agent:
1. Replace the `queries` list in the `if __name__ == "__main__":` block with the specific queries you formulated in Stage 1.
2. Run the code and read the JSON output from stdout.
```python
import asyncio
import json
import os
import subprocess
import sys

# Auto-install dependency to ensure zero-setup for the user
try:
    from perplexity import AsyncPerplexity
except ImportError:
    # Log to stderr so stdout stays strictly JSON
    print("Installing perplexityai...", file=sys.stderr)
    subprocess.check_call([sys.executable, "-m", "pip", "install", "perplexityai", "-q"])
    from perplexity import AsyncPerplexity


async def fetch_results(queries):
    # Ensure the API key exists before making any requests
    if not os.environ.get("PERPLEXITY_API_KEY"):
        print(json.dumps({"error": "PERPLEXITY_API_KEY environment variable is not set."}, ensure_ascii=False))
        return

    client = AsyncPerplexity(
        api_key=os.environ["PERPLEXITY_API_KEY"],
    )

    # Create async tasks for concurrent execution
    tasks = [
        client.search.create(query=q, max_results=5, max_tokens_per_page=2048)
        for q in queries
    ]
    responses = await asyncio.gather(*tasks, return_exceptions=True)

    output = {}
    for q, res in zip(queries, responses):
        if isinstance(res, Exception):
            output[q] = {"error": str(res)}
        else:
            # Extract only the necessary raw data to conserve the context window
            output[q] = [
                {"title": r.title, "url": r.url, "snippet": r.snippet}
                for r in res.results
            ]

    # Output strictly as JSON for the LLM to parse
    print(json.dumps(output, ensure_ascii=False, indent=2))


if __name__ == "__main__":
    # AGENT: Replace this list with your formulated queries
    queries = ["QUERY_1", "QUERY_2", "QUERY_3", "QUERY_4", "QUERY_5"]
    asyncio.run(fetch_results(queries))
```
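If you fan out many queries at once, the API may rate-limit you. One way to cap concurrency is an `asyncio.Semaphore`. The sketch below is a hypothetical extension, not part of the skill: `search_one` is a stand-in that simulates a client call and would be replaced by the real `client.search.create(...)`.

```python
import asyncio


async def search_one(query: str) -> dict:
    # Stand-in for client.search.create(...); replace with the real call.
    await asyncio.sleep(0.01)  # simulate network latency
    return {"query": query, "results": []}


async def fetch_limited(queries, max_concurrent=3):
    # The semaphore gates how many searches run at the same time
    sem = asyncio.Semaphore(max_concurrent)

    async def gated(q):
        async with sem:
            return await search_one(q)

    return await asyncio.gather(*(gated(q) for q in queries), return_exceptions=True)


if __name__ == "__main__":
    out = asyncio.run(fetch_limited([f"QUERY_{i}" for i in range(5)]))
    print(len(out))
```

With `max_concurrent=3`, at most three requests are in flight at any moment while `asyncio.gather` still preserves result order.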
### Stage 3: Synthesis and Citation
Read the JSON output generated by the Python script.
Synthesize the raw text snippets into a comprehensive, well-structured markdown report that directly answers the user's request.
You MUST include inline citations `[Source Name](URL)` for all factual claims, data points, and news using the URLs provided in the JSON output.
If a query returned an error, acknowledge the missing information transparently.
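As a concrete illustration of this stage, the helper below turns the script's JSON output into markdown with inline citations. It is a sketch, not part of the skill, and the sample data is invented to mirror the output shape.

```python
def format_citations(search_output: dict) -> str:
    """Render each query's results as markdown bullets with inline citations."""
    lines = []
    for query, results in search_output.items():
        lines.append(f"## {query}")
        if isinstance(results, dict) and "error" in results:
            # Surface failed queries transparently rather than hiding them
            lines.append(f"- No data retrieved: {results['error']}")
            continue
        for r in results:
            lines.append(f"- {r['snippet']} [{r['title']}]({r['url']})")
    return "\n".join(lines)


# Invented sample mirroring the script's output shape
sample = {
    "AI medical diagnosis accuracy 2026": [
        {"title": "Example Study", "url": "https://example.com/study",
         "snippet": "Diagnostic accuracy improved."}
    ],
    "failed query": {"error": "timeout"},
}
print(format_citations(sample))
```

Note how the error branch keeps the failed query visible in the report, matching the requirement to acknowledge missing information.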
This skill performs deep, concurrent web research by orchestrating asynchronous queries against the Perplexity Search API and synthesizing the results into a clear, cited report. It is designed to fetch fresh, factual raw data from the web before producing conclusions. The skill is optimized for high-throughput research tasks and archival-style investigations.
On each trigger the skill breaks a research request into 3–5 focused search queries, then runs those queries concurrently using an async Perplexity client. The tool collects compact raw snippets (title, URL, snippet) from each result as JSON, then synthesizes the snippets into a structured report with inline citations to the original URLs. Errors or missing API keys are surfaced transparently in the JSON output so the synthesis stage can acknowledge gaps.
Do I need an API key to run searches?
Yes. The skill requires PERPLEXITY_API_KEY in the environment. If the key is missing, the script prints a clear JSON error instead of attempting any requests.
How many queries should I provide?
Provide 3–5 focused queries. This balance preserves depth while enabling parallel coverage of related angles.
What data is returned from each search?
Each query returns a compact JSON array of results with title, URL, and snippet to minimize context size while preserving source evidence.
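To be precise about that shape, a small validation helper (a sketch, not part of the skill) can check each result entry before synthesis:

```python
def is_valid_entry(entry) -> bool:
    # Each result entry must carry exactly the three captured string fields
    required = {"title", "url", "snippet"}
    return (
        isinstance(entry, dict)
        and required <= entry.keys()
        and all(isinstance(entry[k], str) for k in required)
    )


good = {"title": "T", "url": "https://example.com", "snippet": "S"}
bad = {"title": "T"}
print(is_valid_entry(good), is_valid_entry(bad))  # True False
```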
How are sources cited in the final report?
The synthesis step includes inline citations using the captured URLs so every factual claim can be traced back to the original snippet.