
# doge-oss-skill


This skill uploads a local file to DogeCloud OSS and returns a public URL plus detailed metadata for easy sharing.

```bash
npx playbooks add skill openclaw/skills --skill doge-oss-skill
```


SKILL.md
---
name: doge-oss-upload
description: Upload a local file to DogeCloud OSS (DogeCloud Object Storage) and return a public URL + metadata. Use when the user asks to "upload to doge/dogecloud", "generate a public link", or "upload this screenshot to OSS and give me the link".
metadata:
  openclaw:
    requires:
      bins: [python3]
---

# Doge Upload Public Info

Use the bundled uploader script to upload a local file with DogeCloud temporary credentials and print machine-readable public access info.

Read `references/dogecloud-oss.md` when API details are needed.

## Quick Start

1. Export environment variables (both camelCase and `DOGECLOUD_*` names are supported):
   - `accessKey` / `DOGECLOUD_ACCESS_KEY`
   - `secretKey` / `DOGECLOUD_SECRET_KEY`
   - `bucket` / `DOGECLOUD_BUCKET` (bucket name or `s3Bucket`)
   - `endpoint` / `DOGECLOUD_ENDPOINT`
   - `publicBaseUrl` / `DOGECLOUD_PUBLIC_BASE_URL`
   - `prefix` / `DOGECLOUD_PREFIX`
2. Install dependencies:

```bash
python3 -m pip install -U boto3 requests
```

3. Run:

```bash
python3 scripts/doge_upload_public_info.py ./local/file.jpg
```

## Workflow

1. Resolve the bucket from environment variables or the CLI (either the bucket name or the `s3Bucket` value is accepted).
2. Resolve upload key from `--key`, otherwise use `prefix/<local-filename>`.
3. Request temporary credentials from `/auth/tmp_token.json` using scoped permissions.
4. Upload file with Boto3 S3 client and returned `s3Bucket` + `s3Endpoint`.
5. Return JSON with upload metadata and public URL candidates.
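
The credential and upload steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the bundled `scripts/doge_upload_public_info.py`: the signing scheme (HMAC-SHA1 over the API path and request body, sent as `Authorization: TOKEN <accessKey>:<sign>`) and the shape of the tmp_token response (`Credentials`, `Buckets`) follow DogeCloud's published conventions but should be checked against `references/dogecloud-oss.md`, and the helper names here are hypothetical.

```python
import hmac
import json
from hashlib import sha1


def sign_request(secret_key: str, api_path: str, body: str) -> str:
    # HMAC-SHA1 over "<api_path>\n<body>", hex-encoded.
    message = api_path + "\n" + body
    return hmac.new(secret_key.encode(), message.encode(), sha1).hexdigest()


def fetch_tmp_token(access_key: str, secret_key: str, bucket: str) -> dict:
    """Hypothetical helper: POST /auth/tmp_token.json for scoped temporary credentials."""
    import requests  # third-party; installed in the Quick Start step

    api_path = "/auth/tmp_token.json"
    body = json.dumps({"channel": "OSS_UPLOAD", "scopes": [f"{bucket}:*"]})
    resp = requests.post(
        "https://api.dogecloud.com" + api_path,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"TOKEN {access_key}:{sign_request(secret_key, api_path, body)}",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]


def upload_with_tmp_token(creds: dict, local_path: str, object_key: str) -> None:
    """Hypothetical helper: upload via Boto3 using the returned s3Bucket/s3Endpoint."""
    import boto3  # third-party; installed in the Quick Start step

    bucket_info = creds["Buckets"][0]
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["Credentials"]["accessKeyId"],
        aws_secret_access_key=creds["Credentials"]["secretAccessKey"],
        aws_session_token=creds["Credentials"]["sessionToken"],
        endpoint_url=bucket_info["s3Endpoint"],
    )
    s3.upload_file(local_path, bucket_info["s3Bucket"], object_key)
```

Keeping the signing step in its own function makes it easy to verify against the API docs independently of any network call.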

## Output Contract

Return a JSON object with:
- `bucket`, `s3_bucket`, `s3_endpoint`, `object_key`
- `file` metadata (`path`, `size_bytes`, `md5`)
- `tmp_token` metadata (`channel`, `expired_at`)
- `public_info`:
  - `primary_url`
  - `candidates` (custom domain, derived test domain, and S3 endpoint style candidate)
  - `notes`

## Guardrails

- Keep permanent AccessKey/SecretKey on server side only.
- Default to `OSS_UPLOAD` for least privilege; use `OSS_FULL` only when explicitly required.
- If required env vars are missing, fail fast and print a clear reminder listing all missing keys.
- Warn that test domains ending with `.oss.dogecdn.com` can expire (commonly after 30 days).
- Prefer `--public-base-url` when the user requests a production-ready public URL.
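
The fail-fast check could be sketched like this, using the camelCase/`DOGECLOUD_*` pairs from the Quick Start section (the function name is hypothetical):

```python
import os
import sys

# camelCase name -> DOGECLOUD_* fallback, per the Quick Start section.
REQUIRED = {
    "accessKey": "DOGECLOUD_ACCESS_KEY",
    "secretKey": "DOGECLOUD_SECRET_KEY",
    "bucket": "DOGECLOUD_BUCKET",
    "endpoint": "DOGECLOUD_ENDPOINT",
}


def require_env() -> dict:
    """Return resolved values, or exit listing every missing key at once."""
    resolved, missing = {}, []
    for camel, upper in REQUIRED.items():
        value = os.environ.get(camel) or os.environ.get(upper)
        if value:
            resolved[camel] = value
        else:
            missing.append(f"{camel} (or {upper})")
    if missing:
        print("Missing required environment variables: " + ", ".join(missing),
              file=sys.stderr)
        sys.exit(1)
    return resolved
```

Collecting all missing keys before exiting, rather than failing on the first one, saves the user repeated retry cycles.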

## Resources

### scripts/
- `scripts/doge_upload_public_info.py`: upload and public-info extractor CLI.

### references/
- `references/dogecloud-oss.md`: minimal API notes and URL caveats from official docs.

## Overview

This skill uploads a local file to DogeCloud OSS and returns a machine-readable JSON with a public URL and upload metadata. It uses temporary credentials, supports environment-configured buckets and prefixes, and prints candidate public URLs for quick sharing. The tool is designed for scripted or CLI workflows where you need a reproducible public access record.

## How this skill works

The CLI resolves bucket and key from environment variables or arguments, requests scoped temporary credentials from a token endpoint, and performs the upload with a Boto3 S3 client to the provided s3Bucket and s3Endpoint. After upload it computes file metadata (size, MD5), returns the tmp_token info, and emits a set of public URL candidates plus a recommended primary_url. Missing environment variables trigger a clear fail-fast message.

## When to use it

- You need a public URL for a local file quickly from the CLI or a script.
- You want to upload screenshots, backups, or assets to DogeCloud OSS with temporary credentials.
- You require machine-readable upload metadata (MD5, size, object key) for automation.
- You must avoid storing permanent secrets in client-side code and prefer scoped temporary tokens.
- You need multiple URL candidates (custom domain, test domain, S3-style) for testing or production.

## Best practices

- Keep the permanent AccessKey and SecretKey only on a secure server and use the temporary token endpoint for clients.
- Set a publicBaseUrl when you need a production-ready primary URL rather than a test domain.
- Use least privilege: default to the OSS_UPLOAD scope and request OSS_FULL only when necessary.
- Validate required environment variables at startup and fail fast with a list of missing keys.
- Warn users that test domains like *.oss.dogecdn.com may expire and should not be relied on for long-term links.

## Example use cases

- Upload a screenshot and get a shareable public URL from a development machine.
- Automate nightly backup uploads and store the returned JSON metadata in a database for auditing.
- Integrate with CI to push build artifacts to DogeCloud and record the object_key and MD5 for later retrieval.
- Provide temporary upload capability to a frontend by exchanging server-held credentials for a scoped tmp_token.

## FAQ

### What environment variables are required?

Provide accessKey/secretKey (or DOGECLOUD_* variants), bucket (or s3Bucket), endpoint, and optionally publicBaseUrl and prefix. The tool will list any missing keys and exit.

### How are public URLs chosen?

The script emits multiple candidates: a custom publicBaseUrl if provided, a derived test domain from the service, and an S3-style endpoint URL. It also recommends a primary_url based on publicBaseUrl when available.

### Are permanent keys safe in this workflow?

Permanent keys should remain on a trusted server. Clients should use the temporary token endpoint which returns scoped credentials with limited lifetime and permissions.