---
name: supabase-audit-buckets-list
description: List all storage buckets and their configuration to identify the storage attack surface.
---
# List Storage Buckets
> 🔴 **CRITICAL: PROGRESSIVE FILE UPDATES REQUIRED**
>
> You MUST write to context files **AS YOU GO**, not just at the end.
> - Write to `.sb-pentest-context.json` **IMMEDIATELY after each bucket discovered**
> - Log to `.sb-pentest-audit.log` **BEFORE and AFTER each operation**
> - **DO NOT** wait until the skill completes to update files
> - If the skill crashes or is interrupted, all prior findings must already be saved
>
> **This is not optional. Failure to write progressively is a critical error.**
This skill discovers all storage buckets configured in a Supabase project.
## When to Use This Skill
- To inventory all storage buckets
- Before testing bucket access permissions
- To identify publicly accessible buckets
- As part of storage security audit
## Prerequisites
- Supabase URL and anon key available
- Detection completed (the target is confirmed as a Supabase project)
## Understanding Supabase Storage
Supabase Storage exposes its API under a single base URL:
```
https://[project].supabase.co/storage/v1/
```
Buckets can be:
- **Public**: Files accessible without authentication
- **Private**: Files require authentication and RLS policies
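A quick probe illustrates the difference. This is a sketch with placeholder bucket and file names, not paths confirmed to exist on the target:

```bash
# Public bucket: the object is served with no credentials at all
curl -s -o /dev/null -w "%{http_code}\n" \
  "$SUPABASE_URL/storage/v1/object/public/avatars/user1.png"
# 200 = publicly readable; 400/404 = not public or object absent

# Private bucket: the same object route requires an apikey and a JWT,
# and RLS policies decide whether this role may read the file
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "apikey: $ANON_KEY" \
  -H "Authorization: Bearer $ANON_KEY" \
  "$SUPABASE_URL/storage/v1/object/documents/report.pdf"
```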
## Storage API Endpoints
| Endpoint | Purpose |
|----------|---------|
| `/storage/v1/bucket` | List buckets |
| `/storage/v1/object/list/[bucket]` | List files in bucket |
| `/storage/v1/object/[bucket]/[path]` | Access file |
| `/storage/v1/object/public/[bucket]/[path]` | Public file URL |
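A minimal sketch of hitting these endpoints with curl. Note that the list-objects endpoint typically expects a POST with a JSON body (prefix/limit/offset); treat the exact body shape as an assumption to verify against your Storage API version:

```bash
# List buckets (simple GET)
curl -s "$SUPABASE_URL/storage/v1/bucket" \
  -H "apikey: $ANON_KEY" \
  -H "Authorization: Bearer $ANON_KEY"

# List objects in a bucket (POST with a JSON body, not a GET)
curl -s -X POST "$SUPABASE_URL/storage/v1/object/list/avatars" \
  -H "apikey: $ANON_KEY" \
  -H "Authorization: Bearer $ANON_KEY" \
  -H "Content-Type: application/json" \
  -d '{"prefix": "", "limit": 100, "offset": 0}'
```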
## Usage
### Basic Bucket List
```
List storage buckets on my Supabase project
```
### With Configuration Details
```
List all buckets with their security settings
```
## Output Format
```
═══════════════════════════════════════════════════════════
STORAGE BUCKETS
═══════════════════════════════════════════════════════════
Project: abc123def.supabase.co
Buckets Found: 5

─────────────────────────────────────────────────────────
Bucket Inventory
─────────────────────────────────────────────────────────

1. avatars
   ├── Public: ✅ YES
   ├── File Size Limit: 1MB
   ├── Allowed MIME: image/jpeg, image/png, image/webp
   ├── Files (estimated): 1,247
   └── Status: ℹ️ Expected public bucket

   Public URLs pattern:
   https://abc123def.supabase.co/storage/v1/object/public/avatars/[filename]

2. documents
   ├── Public: ❌ NO (Private)
   ├── File Size Limit: 50MB
   ├── Allowed MIME: application/pdf, application/msword, *
   ├── Files (estimated): 523
   └── Status: ✅ Private, needs RLS verification

3. uploads
   ├── Public: ✅ YES
   ├── File Size Limit: 100MB
   ├── Allowed MIME: */* (ANY)
   ├── Files (estimated): 3,891
   └── Status: 🟠 P1 - Public with unrestricted MIME types

   Risk: Any file type can be uploaded and accessed
   Recommendation: Restrict allowed MIME types

4. backups
   ├── Public: ✅ YES ⚠️ UNEXPECTED
   ├── File Size Limit: 500MB
   ├── Allowed MIME: */*
   ├── Files (estimated): 45
   └── Status: 🔴 P0 - Sensitive bucket is PUBLIC

   Risk: Backup files publicly accessible!
   Immediate Action: Change to private bucket

5. temp
   ├── Public: ❌ NO
   ├── File Size Limit: 10MB
   ├── Allowed MIME: */*
   ├── Files (estimated): 12
   └── Status: ✅ Private temporary storage

─────────────────────────────────────────────────────────
Summary
─────────────────────────────────────────────────────────
Total Buckets: 5
Public Buckets: 3
├── Expected Public: 1 (avatars)
├── P1 Issues: 1 (uploads - unrestricted MIME)
└── P0 Critical: 1 (backups - should be private)
Private Buckets: 2
└── Need RLS verification with supabase-audit-buckets-read

Next Steps:
├── Fix 'backups' bucket - make private immediately
├── Restrict MIME types on 'uploads' bucket
├── Test RLS on private buckets
└── Verify no sensitive files in public buckets
═══════════════════════════════════════════════════════════
```
## Bucket Configuration Analysis
| Config | Good | Bad |
|--------|------|-----|
| public: false | ✅ Private by default | ❌ public: true for sensitive data |
| fileSizeLimit | ✅ Appropriate limits | ❌ No limit or very large |
| allowedMimeTypes | ✅ Restricted list | ❌ `*/*` allows anything |
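The same triage can be scripted. A hedged jq sketch, assuming the bucket listing returns `public` and `allowed_mime_types` fields as shown in the context output below:

```bash
# Classify each bucket from the /storage/v1/bucket response
curl -s "$SUPABASE_URL/storage/v1/bucket" \
  -H "apikey: $ANON_KEY" -H "Authorization: Bearer $ANON_KEY" |
jq '[.[] | {
  name,
  public,
  risk: (if .public and ((.allowed_mime_types // ["*/*"]) == ["*/*"])
         then "P1: public with unrestricted MIME types"
         elif .public then "review: confirm this bucket is meant to be public"
         else "private: verify RLS with supabase-audit-buckets-read" end)
}]'
```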
## Context Output
```json
{
"storage": {
"buckets": [
{
"name": "avatars",
"public": true,
"file_size_limit": 1048576,
"allowed_mime_types": ["image/jpeg", "image/png", "image/webp"],
"estimated_files": 1247,
"risk_level": "info",
"expected_public": true
},
{
"name": "backups",
"public": true,
"file_size_limit": 524288000,
"allowed_mime_types": ["*/*"],
"estimated_files": 45,
"risk_level": "P0",
"finding": "Sensitive bucket publicly accessible"
}
],
"summary": {
"total": 5,
"public": 3,
"private": 2,
"p0_issues": 1,
"p1_issues": 1
}
}
}
```
## Security Recommendations
### For Public Buckets
```sql
-- Create restrictive RLS policy even for public buckets
CREATE POLICY "Public read avatars"
ON storage.objects FOR SELECT
USING (bucket_id = 'avatars');
CREATE POLICY "Users upload own avatar"
ON storage.objects FOR INSERT
WITH CHECK (
bucket_id = 'avatars'
AND auth.uid()::text = (storage.foldername(name))[1]
);
```
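To confirm the INSERT policy actually scopes writes to the caller's own folder, a hedged probe can try uploading into someone else's folder (the path and payload here are placeholders):

```bash
# As the anon role, try to write into another user's folder.
# With the policy above, auth.uid() cannot match the folder name,
# so the upload should be rejected (expect a 4xx status).
curl -s -o /dev/null -w "%{http_code}\n" -X POST \
  "$SUPABASE_URL/storage/v1/object/avatars/other-user-uid/avatar.png" \
  -H "apikey: $ANON_KEY" \
  -H "Authorization: Bearer $ANON_KEY" \
  -H "Content-Type: image/png" \
  --data-binary @test.png
```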
### For Private Buckets
```sql
-- Only owners can access their files
CREATE POLICY "Users access own documents"
ON storage.objects FOR ALL
USING (
bucket_id = 'documents'
AND auth.uid()::text = (storage.foldername(name))[1]
);
```
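Legitimate clients usually reach private files through signed URLs rather than direct object routes. A sketch, assuming the standard sign endpoint and an authenticated user JWT in `$USER_JWT`:

```bash
# Mint a time-limited signed URL for a private object.
# Succeeds only if the JWT passes the bucket's RLS policy.
curl -s -X POST \
  "$SUPABASE_URL/storage/v1/object/sign/documents/user-uid/report.pdf" \
  -H "apikey: $ANON_KEY" \
  -H "Authorization: Bearer $USER_JWT" \
  -H "Content-Type: application/json" \
  -d '{"expiresIn": 3600}'
# The response should contain a signed URL valid for one hour
```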
### Fix Public Backup Bucket
```sql
-- Make bucket private
UPDATE storage.buckets
SET public = false
WHERE name = 'backups';
-- Add strict RLS
CREATE POLICY "Only admins access backups"
ON storage.objects FOR ALL
USING (
bucket_id = 'backups'
AND (SELECT is_admin FROM profiles WHERE id = auth.uid())
);
```
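After applying the fix, re-test the public URL pattern. A quick check (the file name is a placeholder):

```bash
# The public object route should no longer serve files from 'backups'.
# Anything other than 200 here confirms the bucket is no longer public.
curl -s -o /dev/null -w "%{http_code}\n" \
  "$SUPABASE_URL/storage/v1/object/public/backups/db-dump.sql"
```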
## Common Issues
❌ **Problem:** Cannot list buckets
✅ **Solution:** Storage API may be restricted. This is actually good security. Note as "unable to enumerate."

❌ **Problem:** Many buckets found
✅ **Solution:** Large applications may have many. Focus on public buckets first.

❌ **Problem:** Bucket count doesn't match expected
✅ **Solution:** Some buckets may be created dynamically. Check application code.
## MANDATORY: Progressive Context File Updates
⚠️ **This skill MUST update tracking files PROGRESSIVELY during execution, NOT just at the end.**
### Critical Rule: Write As You Go
**DO NOT** batch all writes at the end. Instead:
1. **Before starting bucket enumeration** → Log the action to `.sb-pentest-audit.log`
2. **After each bucket discovered** → Immediately update `.sb-pentest-context.json`
3. **After each configuration analyzed** → Log the result
This ensures that if the skill is interrupted, crashes, or times out, all findings up to that point are preserved.
### Required Actions (Progressive)
1. **Update `.sb-pentest-context.json`** with results:
```json
{
"storage": {
"buckets": [ ... ],
"summary": { "total": 5, "public": 3, "private": 2 }
}
}
```
2. **Log to `.sb-pentest-audit.log`**:
```
[TIMESTAMP] [supabase-audit-buckets-list] [START] Listing storage buckets
[TIMESTAMP] [supabase-audit-buckets-list] [SUCCESS] Found 5 buckets
[TIMESTAMP] [supabase-audit-buckets-list] [CONTEXT_UPDATED] .sb-pentest-context.json updated
```
3. **If files don't exist**, create them before writing.
**FAILURE TO UPDATE CONTEXT FILES IS NOT ACCEPTABLE.**
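One possible shape for the write-as-you-go loop, sketched in bash with jq. The file names follow this skill's conventions; the merge logic itself is an illustrative assumption, not a prescribed implementation:

```bash
log() {
  # Append one timestamped line to the audit log (created on first write)
  echo "[$(date -u +%Y-%m-%dT%H:%M:%SZ)] [supabase-audit-buckets-list] $1" \
    >> .sb-pentest-audit.log
}

log "[START] Listing storage buckets"
[ -f .sb-pentest-context.json ] || echo '{}' > .sb-pentest-context.json

# Merge each bucket into the context file as soon as it is seen:
# one write per bucket, never one batch at the end.
curl -s "$SUPABASE_URL/storage/v1/bucket" \
  -H "apikey: $ANON_KEY" -H "Authorization: Bearer $ANON_KEY" |
jq -c '.[]' | while read -r bucket; do
  jq --argjson b "$bucket" \
    '.storage.buckets = ((.storage.buckets // []) + [$b])' \
    .sb-pentest-context.json > .ctx.tmp && mv .ctx.tmp .sb-pentest-context.json
  log "[CONTEXT_UPDATED] bucket $(echo "$bucket" | jq -r .name) recorded"
done

log "[SUCCESS] Bucket enumeration complete"
```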
## MANDATORY: Evidence Collection
📁 **Evidence Directory:** `.sb-pentest-evidence/04-storage-audit/`
### Evidence Files to Create
| File | Content |
|------|---------|
| `buckets-config.json` | All bucket configurations |
| `buckets/[name]/file-list.json` | File listing per bucket |
### Evidence Format
```json
{
"evidence_id": "STG-LIST-001",
"timestamp": "2025-01-31T10:35:00Z",
"category": "storage-audit",
"type": "bucket_enumeration",
"request": {
"method": "GET",
"url": "https://abc123def.supabase.co/storage/v1/bucket",
"curl_command": "curl -s '$URL/storage/v1/bucket' -H 'apikey: $ANON_KEY' -H 'Authorization: Bearer $ANON_KEY'"
},
"buckets": [
{
"name": "avatars",
"public": true,
"file_size_limit": 1048576,
"allowed_mime_types": ["image/jpeg", "image/png"],
"risk_level": "info",
"assessment": "Appropriate for public avatars"
},
{
"name": "backups",
"public": true,
"file_size_limit": 524288000,
"allowed_mime_types": ["*/*"],
"risk_level": "P0",
"assessment": "CRITICAL: Backup bucket should not be public"
}
],
"summary": {
"total_buckets": 5,
"public_buckets": 3,
"private_buckets": 2,
"critical_misconfigurations": 1
}
}
```
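A hedged sketch of producing `buckets-config.json` in that envelope, with jq doing the assembly; the wrapper fields mirror the format above:

```bash
EVID_DIR=".sb-pentest-evidence/04-storage-audit"
mkdir -p "$EVID_DIR/buckets"

# Capture the raw bucket listing once, then wrap it in the evidence envelope
RESP=$(curl -s "$SUPABASE_URL/storage/v1/bucket" \
  -H "apikey: $ANON_KEY" -H "Authorization: Bearer $ANON_KEY")

jq -n --argjson buckets "$RESP" \
      --arg ts "$(date -u +%Y-%m-%dT%H:%M:%SZ)" '{
  evidence_id: "STG-LIST-001",
  timestamp: $ts,
  category: "storage-audit",
  type: "bucket_enumeration",
  buckets: $buckets
}' > "$EVID_DIR/buckets-config.json"
```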
### Add to curl-commands.sh
```bash
# === STORAGE BUCKET ENUMERATION ===
# List all buckets
curl -s "$SUPABASE_URL/storage/v1/bucket" \
-H "apikey: $ANON_KEY" \
-H "Authorization: Bearer $ANON_KEY"
# List files in specific bucket (POST with a JSON body)
curl -s -X POST "$SUPABASE_URL/storage/v1/object/list/backups" \
  -H "apikey: $ANON_KEY" \
  -H "Authorization: Bearer $ANON_KEY" \
  -H "Content-Type: application/json" \
  -d '{"prefix": "", "limit": 100}'
```
## Related Skills
- `supabase-audit-buckets-read` → Attempt to read files
- `supabase-audit-buckets-public` → Find misconfigured public buckets
- `supabase-audit-storage-rls` → Test storage RLS policies