[2.22.0] Add rclone skill and enhance feature-video command

- Add rclone skill for uploading to S3, Cloudflare R2, Backblaze B2
- Update /feature-video with better ffmpeg commands (proper scaling)
- Add rclone integration for cloud uploads in feature-video
- 27 agents, 20 commands, 13 skills, 2 MCP servers

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Kieran Klaassen
2026-01-05 11:07:02 -08:00
parent fd21f58264
commit 84890f1e16
6 changed files with 270 additions and 39 deletions

View File

@@ -1,7 +1,7 @@
 {
   "name": "compound-engineering",
-  "version": "2.21.0",
-  "description": "AI-powered development tools. 27 agents, 20 commands, 12 skills, 2 MCP servers for code review, research, design, and workflow automation.",
+  "version": "2.22.0",
+  "description": "AI-powered development tools. 27 agents, 20 commands, 13 skills, 2 MCP servers for code review, research, design, and workflow automation.",
   "author": {
     "name": "Kieran Klaassen",
     "email": "kieran@every.to",

View File

@@ -5,6 +5,26 @@ All notable changes to the compound-engineering plugin will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [2.22.0] - 2026-01-05
+
+### Added
+
+- **`rclone` skill** - Upload files to S3, Cloudflare R2, Backblaze B2, and other cloud storage providers
+
+### Changed
+
+- **`/feature-video` command** - Enhanced with:
+  - Better ffmpeg commands for video/GIF creation (proper scaling, framerate control)
+  - rclone integration for cloud uploads
+  - Screenshot copying to project folder
+  - Improved upload options workflow
+
+### Summary
+
+- 27 agents, 20 commands, 13 skills, 2 MCP servers
+
+---
+
 ## [2.21.0] - 2026-01-05
 
 ### Fixed

View File

@@ -8,7 +8,7 @@ AI-powered development tools that get smarter with every use. Make each unit of
 |-----------|-------|
 | Agents | 27 |
 | Commands | 20 |
-| Skills | 12 |
+| Skills | 13 |
 | MCP Servers | 2 |
 
 ## Agents
@@ -128,6 +128,12 @@ Core workflow commands use `workflows:` prefix to avoid collisions with built-in
 | `file-todos` | File-based todo tracking system |
 | `git-worktree` | Manage Git worktrees for parallel development |
 
+### File Transfer
+
+| Skill | Description |
+|-------|-------------|
+| `rclone` | Upload files to S3, Cloudflare R2, Backblaze B2, and cloud storage |
+
 ### Image Generation
 
 | Skill | Description |

View File

@@ -24,7 +24,8 @@ This command creates professional video walkthroughs of features for PR document
 - Local development server running (e.g., `bin/dev`, `rails server`)
 - Playwright MCP server connected
 - Git repository with a PR to document
-- `ffmpeg` installed (for video conversion if needed)
+- `ffmpeg` installed (for video conversion)
+- `rclone` configured (optional, for cloud upload - see rclone skill)
 </requirements>
 
 ## Main Tasks
@@ -180,58 +181,52 @@ mcp__plugin_compound-engineering_pw__browser_take_screenshot({ filename: "tmp/sc
 ```
 
 **Create video/GIF from screenshots:**
 
 ```bash
-# Create GIF from screenshots
-ffmpeg -framerate 1 -pattern_type glob -i 'tmp/screenshots/*.png' \
+# Create directories
+mkdir -p tmp/videos tmp/screenshots
+
+# Create MP4 video (RECOMMENDED - better quality, smaller size)
+# -framerate 0.5 = 2 seconds per frame (slower playback)
+# -framerate 1 = 1 second per frame
+ffmpeg -y -framerate 0.5 -pattern_type glob -i '.playwright-mcp/tmp/screenshots/*.png' \
+  -c:v libx264 -pix_fmt yuv420p -vf "scale=1280:-2" \
+  tmp/videos/feature-demo.mp4
+
+# Create GIF (larger file, but works everywhere)
+ffmpeg -y -framerate 0.7 -pattern_type glob -i '.playwright-mcp/tmp/screenshots/*.png' \
   -vf "scale=1280:-1:flags=lanczos,split[s0][s1];[s0]palettegen[p];[s1][p]paletteuse" \
   -loop 0 tmp/videos/feature-demo.gif
-
-# Or create MP4 for better quality
-ffmpeg -framerate 1 -pattern_type glob -i 'tmp/screenshots/*.png' \
-  -c:v libx264 -pix_fmt yuv420p -vf "scale=1280:-1" \
-  tmp/videos/feature-demo.mp4
+
+# Copy screenshots to project folder for easy access
+cp -r .playwright-mcp/tmp/screenshots tmp/
 ```
 
+**Note:** The `-2` in scale ensures height is divisible by 2 (required for H.264).
+
 </record_walkthrough>
 
 ### 6. Upload the Video
 
 <upload_video>
 
-**Option A: Upload to GitHub (via PR comment with attachment)**
-
-GitHub doesn't support direct video uploads via API, but you can:
-1. Drag-drop in browser, or
-2. Use a hosting service
-
-**Option B: Upload to transfer.sh (temporary, 14 days)**
+**Upload with rclone:**
 
 ```bash
-curl --upload-file tmp/videos/feature-demo.gif https://transfer.sh/feature-demo.gif
+# Check rclone is configured
+rclone listremotes
+
+# Upload video and screenshots to cloud storage
+rclone copy tmp/videos/feature-demo.mp4 r2:your-bucket/pr-videos/ --progress
+rclone copy tmp/screenshots/ r2:your-bucket/pr-videos/screenshots/ --progress
+
+# List uploaded files
+rclone ls r2:your-bucket/pr-videos/
 ```
 
-**Option C: Upload to Cloudflare R2/S3 (if configured)**
-```bash
-# If AWS CLI is configured
-aws s3 cp tmp/videos/feature-demo.gif s3://your-bucket/pr-videos/pr-[number]-demo.gif --acl public-read
+The public URL depends on your bucket configuration. For R2 with public access:
 ```
-
-**Option D: Keep local and provide path**
-```bash
-# Just provide the local path for manual upload
-echo "Video saved to: $(pwd)/tmp/videos/feature-demo.gif"
-```
-
-Ask user for upload preference:
-```markdown
-**Video Ready**
-Video saved to: `tmp/videos/feature-demo.gif`
-Size: [size]
-How would you like to share it?
-1. Upload to transfer.sh (temporary link, 14 days)
-2. Keep local - I'll upload manually
-3. Upload to S3/R2 (requires config)
+https://pub-XXXXX.r2.dev/pr-videos/feature-demo.mp4
 ```
 
 </upload_video>

View File

@@ -0,0 +1,150 @@
---
name: rclone
description: Upload, sync, and manage files across cloud storage providers using rclone. Use when uploading files (images, videos, documents) to S3, Cloudflare R2, Backblaze B2, Google Drive, Dropbox, or any S3-compatible storage. Triggers on "upload to S3", "sync to cloud", "rclone", "backup files", "upload video/image to bucket", or requests to transfer files to remote storage.
---
# rclone File Transfer Skill
## Setup Check (Always Run First)
Before any rclone operation, verify installation and configuration:
```bash
# Check if rclone is installed
command -v rclone >/dev/null 2>&1 && echo "rclone installed: $(rclone version | head -1)" || echo "NOT INSTALLED"
# List configured remotes
rclone listremotes 2>/dev/null || echo "NO REMOTES CONFIGURED"
```
### If rclone is NOT installed
Guide the user to install:
```bash
# macOS
brew install rclone
# Linux (script install)
curl https://rclone.org/install.sh | sudo bash
# Or via package manager
sudo apt install rclone # Debian/Ubuntu
sudo dnf install rclone # Fedora
```
### If NO remotes are configured
Walk the user through interactive configuration:
```bash
rclone config
```
**Common provider setup quick reference:**
| Provider | Type | Key Settings |
|----------|------|--------------|
| AWS S3 | `s3` | access_key_id, secret_access_key, region |
| Cloudflare R2 | `s3` | access_key_id, secret_access_key, endpoint (account_id.r2.cloudflarestorage.com) |
| Backblaze B2 | `b2` | account (keyID), key (applicationKey) |
| DigitalOcean Spaces | `s3` | access_key_id, secret_access_key, endpoint (region.digitaloceanspaces.com) |
| Google Drive | `drive` | OAuth flow (opens browser) |
| Dropbox | `dropbox` | OAuth flow (opens browser) |
**Example: Configure Cloudflare R2**
```bash
rclone config create r2 s3 \
provider=Cloudflare \
access_key_id=YOUR_ACCESS_KEY \
secret_access_key=YOUR_SECRET_KEY \
endpoint=ACCOUNT_ID.r2.cloudflarestorage.com \
acl=private
```
**Example: Configure AWS S3**
```bash
rclone config create aws s3 \
provider=AWS \
access_key_id=YOUR_ACCESS_KEY \
secret_access_key=YOUR_SECRET_KEY \
region=us-east-1
```
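**Example (sketch): Configure Backblaze B2** - a minimal version using the `account` (keyID) and `key` (applicationKey) settings from the table above; values are placeholders:
```bash
# B2 remote named "b2"; account = keyID, key = applicationKey from the B2 console
rclone config create b2 b2 \
  account=YOUR_KEY_ID \
  key=YOUR_APPLICATION_KEY
```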
## Common Operations
### Upload single file
```bash
rclone copy /path/to/file.mp4 remote:bucket/path/ --progress
```
### Upload directory
```bash
rclone copy /path/to/folder remote:bucket/folder/ --progress
```
### Sync directory (mirror, deletes removed files)
```bash
rclone sync /local/path remote:bucket/path/ --progress
```
### List remote contents
```bash
rclone ls remote:bucket/
rclone lsd remote:bucket/ # directories only
```
### Check what would be transferred (dry run)
```bash
rclone copy /path remote:bucket/ --dry-run
```
## Useful Flags
| Flag | Purpose |
|------|---------|
| `--progress` | Show transfer progress |
| `--dry-run` | Preview without transferring |
| `-v` | Verbose output |
| `--transfers=N` | Parallel transfers (default 4) |
| `--bwlimit=RATE` | Bandwidth limit (e.g., `10M`) |
| `--checksum` | Compare by checksum, not size/time |
| `--exclude="*.tmp"` | Exclude patterns |
| `--include="*.mp4"` | Include only matching |
| `--min-size=SIZE` | Skip files smaller than SIZE |
| `--max-size=SIZE` | Skip files larger than SIZE |
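These flags compose. A sketch of a filtered, rate-limited upload, with placeholder remote and bucket names:
```bash
# Preview first: MP4s only, 8 parallel transfers, capped at 10 MB/s
rclone copy tmp/videos/ remote:bucket/pr-videos/ \
  --include="*.mp4" --transfers=8 --bwlimit=10M --progress --dry-run
# Re-run without --dry-run to perform the transfer
```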
## Large File Uploads
For videos and large files, use chunked uploads:
```bash
# S3 multipart upload (automatic for >200MB)
rclone copy large_video.mp4 remote:bucket/ --s3-chunk-size=64M --progress
# Resume interrupted transfers
rclone copy /path remote:bucket/ --progress --retries=5
```
## Verify Upload
```bash
# Check file exists and matches
rclone check /local/file remote:bucket/file
# Get file info
rclone lsl remote:bucket/path/to/file
```
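For private buckets, `rclone link` can mint a time-limited share URL on backends that support it (a sketch - support and maximum expiry vary by provider; S3-style presigned URLs cap at 7 days):
```bash
# Share/presigned URL for one object, valid 24 hours (backend support varies)
rclone link remote:bucket/pr-videos/feature-demo.mp4 --expire 24h
```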
## Troubleshooting
```bash
# Test connection
rclone lsd remote:
# Debug connection issues
rclone lsd remote: -vv
# Check config
rclone config show remote
```

View File

@@ -0,0 +1,60 @@
#!/bin/bash
# rclone setup checker - verifies installation and configuration
set -e
echo "=== rclone Setup Check ==="
echo
# Check if rclone is installed
if command -v rclone >/dev/null 2>&1; then
echo "✓ rclone installed"
rclone version | head -1
echo
else
echo "✗ rclone NOT INSTALLED"
echo
echo "Install with:"
echo " macOS: brew install rclone"
echo " Linux: curl https://rclone.org/install.sh | sudo bash"
echo " or: sudo apt install rclone"
exit 1
fi
# Check for configured remotes
REMOTES=$(rclone listremotes 2>/dev/null || true)
if [ -z "$REMOTES" ]; then
echo "✗ No remotes configured"
echo
echo "Run 'rclone config' to set up a remote, or use:"
echo
echo " # Cloudflare R2"
echo " rclone config create r2 s3 provider=Cloudflare \\"
echo " access_key_id=KEY secret_access_key=SECRET \\"
echo " endpoint=ACCOUNT_ID.r2.cloudflarestorage.com"
echo
echo " # AWS S3"
echo " rclone config create aws s3 provider=AWS \\"
echo " access_key_id=KEY secret_access_key=SECRET region=us-east-1"
echo
exit 1
else
echo "✓ Configured remotes:"
echo "$REMOTES" | sed 's/^/ /'
echo
fi
# Test connectivity for each remote
echo "Testing remote connectivity..."
for remote in $REMOTES; do
remote_name="${remote%:}"
if rclone lsd "$remote" >/dev/null 2>&1; then
echo "$remote_name - connected"
else
echo "$remote_name - connection failed (check credentials)"
fi
done
echo
echo "=== Setup Complete ==="