
knowledge skill


This skill lets you search a local knowledge base and switch between local retrieval and AnythingLLM, enabling fast document retrieval and offline access.

npx playbooks add skill openclaw/skills --skill knowledge

Review the files below or copy the command above to add this skill to your agents.

Files (3)
SKILL.md
458 B
---
name: knowledge
description: Local knowledge base integration - document retrieval, ingestion, and dual-mode switching
---

# Local Knowledge Base

## Usage

| Command | Description |
|------|------|
| "Search for xxx" | Search the knowledge base |
| "Switch to the local knowledge base" | Use local retrieval |
| "Switch to AnythingLLM" | Use conversational mode |
| "Knowledge base statistics" | View document count |

## File Locations

- Knowledge base root: `E:/knowledge-base`
- API service: `http://127.0.0.1:8001`

Overview

This skill integrates a local knowledge base for document retrieval, ingestion, and a dual-mode switch between retrieval and conversational modes. It provides simple commands to search the archive, feed new documents, and view corpus statistics. The skill runs against a local folder and exposes a local API endpoint for retrieval operations.

How this skill works

The skill indexes files stored in a configurable local directory and serves retrieval results via a local HTTP API. It supports feeding new documents into the index (ingestion) and a dual-mode operation: a retrieval mode for precise document search and a conversational mode for free-form dialogue. Commands toggle modes, query the index, and return document counts or search results.
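A minimal sketch of querying the local retrieval API described above, assuming a hypothetical `/search` route with a `q` query parameter (the actual endpoint paths are not documented here; only the base URL `http://127.0.0.1:8001` comes from SKILL.md):

```python
import json
import urllib.parse
import urllib.request

API_BASE = "http://127.0.0.1:8001"  # local API service from SKILL.md


def build_search_url(query: str, base: str = API_BASE) -> str:
    """Build the retrieval URL; '/search' and 'q' are assumed names."""
    return f"{base}/search?{urllib.parse.urlencode({'q': query})}"


def search(query: str) -> dict:
    """Send the query to the local index and decode the JSON response."""
    with urllib.request.urlopen(build_search_url(query), timeout=5) as resp:
        return json.load(resp)

# Example: results = search("installation guide")
```

Because the service listens only on localhost, this client needs no authentication on the same machine; adapt the route and response handling to whatever the local API actually exposes.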

When to use it

  • When you need fast, private document search without sending data to external services.
  • When you want to combine direct retrieval with a conversational assistant.
  • When maintaining an offline archive or backup of skill-related documents.
  • During development or testing of local LLM integrations and retrieval pipelines.
  • When you require an easy way to ingest and monitor a local corpus.

Best practices

  • Keep the knowledge root directory on a stable drive and back it up regularly.
  • Use clear file naming and folder organization to improve retrieval relevance.
  • Re-index after large bulk ingestions to ensure search freshness.
  • Limit API exposure to localhost or secure the endpoint when needed.
  • Monitor document counts and index health with periodic checks.
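The monitoring advice above can be sketched as a small periodic check. The `/stats` route and the `document_count` response field are illustrative assumptions, not a documented API:

```python
import json
import urllib.request

API_BASE = "http://127.0.0.1:8001"


def needs_attention(stats: dict, expected_min: int) -> bool:
    """Flag a corpus whose document count fell below the expected minimum,
    which may indicate failed ingestions or a wiped index."""
    return stats.get("document_count", 0) < expected_min


def fetch_stats(base: str = API_BASE) -> dict:
    """Fetch index statistics; '/stats' is an assumed route name."""
    with urllib.request.urlopen(f"{base}/stats", timeout=5) as resp:
        return json.load(resp)

# Example: if needs_attention(fetch_stats(), expected_min=100): re-index or alert.
```

Running a check like this on a schedule (cron, Task Scheduler) catches silent ingestion failures before searches start returning stale or missing results.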

Example use cases

  • Search a product manual or policy stored in the local knowledge root for quick answers.
  • Switch to conversational mode to ask follow-up questions that combine retrieval results with dialogue.
  • Feed new research papers into the local index and re-run searches to surface the latest content.
  • Run periodic statistics to verify the number of documents stored and detect ingestion failures.
  • Use the local API for integration with other tools or scripts that require offline retrieval.
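For the ingestion use cases above, a sketch that collects candidate files from the knowledge root before feeding them to the index; the extension allowlist is an illustrative assumption and should match what the ingestion pipeline can actually parse:

```python
from pathlib import Path

# Extensions worth indexing; adjust to match your ingestion pipeline.
INGESTIBLE = {".md", ".txt", ".pdf", ".html"}


def find_ingestible(root: str) -> list:
    """Recursively list files under the knowledge root that the index can consume."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix.lower() in INGESTIBLE
    )

# Example: for path in find_ingestible("E:/knowledge-base"): feed path to the API.
```

Filtering before ingestion keeps binaries and temporary files out of the index, which improves both retrieval relevance and the accuracy of document-count checks.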

FAQ

Where are documents stored?

Documents are stored in a configurable local folder; update the knowledge root path to point to your archive.

How do I switch modes?

Use the provided toggle commands to switch between retrieval (local knowledge base) and conversational (AnythingLLM) modes.

Is the API exposed remotely?

By default the service listens on localhost. Restrict or secure the endpoint if you expose it beyond the machine.