feat: dark factory — autonomous CI/CD agents for harb
Three agents extracted from ~/scripts/harb-{dev,review}/:
- dev/ — pull-based dev agent (find ready issues → implement → PR → merge)
- review/ — AI code review (structured verdicts, follow-up issues)
- factory/ — supervisor (bash health checks, auto-fix, escalation)
All secrets externalized to .env (see .env.example).
Shared env/helpers in lib/env.sh.
This commit is contained in:
commit cb24968d9b

10 changed files with 2848 additions and 0 deletions

31  .env.example  Normal file
@@ -0,0 +1,31 @@
# Dark Factory — Environment Configuration
# Copy to .env and fill in your values.
# NEVER commit .env to the repo.

# Codeberg API token (read from ~/.netrc by default, override here if needed)
# CODEBERG_TOKEN=

# Codeberg review bot token (separate account for formal reviews)
REVIEW_BOT_TOKEN=

# Woodpecker CI API token
WOODPECKER_TOKEN=

# Woodpecker CI server URL
WOODPECKER_SERVER=http://localhost:8000

# Woodpecker Postgres (for direct DB queries)
WOODPECKER_DB_PASSWORD=
WOODPECKER_DB_USER=woodpecker
WOODPECKER_DB_HOST=127.0.0.1
WOODPECKER_DB_NAME=woodpecker

# Target Codeberg repo
CODEBERG_REPO=johba/harb
CODEBERG_API=https://codeberg.org/api/v1/repos/johba/harb

# Harb repo local path
HARB_REPO_ROOT=/home/debian/harb

# Claude CLI timeout (seconds)
CLAUDE_TIMEOUT=7200
11  .gitignore  vendored  Normal file
@@ -0,0 +1,11 @@
# Secrets
.env

# Runtime state
*.log
state.json
*.lock
*.pid

# OS
.DS_Store
103  README.md  Normal file
@@ -0,0 +1,103 @@
# 🏭 Dark Factory

Autonomous CI/CD factory for [harb](https://codeberg.org/johba/harb). Three agents, zero supervision needed.

## Architecture

```
cron (*/10) ──→ factory-poll.sh     ← supervisor (bash checks, zero tokens)
                  ├── all clear? → exit 0
                  └── problem?   → alert (or claude -p for complex fixes)

cron (*/10) ──→ dev-poll.sh         ← pulls ready issues, spawns dev-agent
                  └── dev-agent.sh  ← claude -p: implement → PR → CI → review → merge

cron (*/10) ──→ review-poll.sh      ← finds unreviewed PRs, spawns review
                  └── review-pr.sh  ← claude -p: review → approve/request changes
```

## Setup

```bash
# 1. Clone
git clone ssh://git@codeberg.org/johba/dark-factory.git
cd dark-factory

# 2. Configure
cp .env.example .env
# Fill in your tokens (see .env.example for descriptions)

# 3. Install cron
crontab -e
# Add:
# */10 * * * * /path/to/dark-factory/factory/factory-poll.sh
# */10 * * * * /path/to/dark-factory/dev/dev-poll.sh
# */10 * * * * /path/to/dark-factory/review/review-poll.sh

# 4. Verify
bash factory/factory-poll.sh  # should log "all clear"
```

## Directory Structure

```
dark-factory/
├── .env.example        # Template — copy to .env, add secrets
├── .gitignore          # Excludes .env, logs, state files
├── lib/
│   └── env.sh          # Shared: load .env, PATH, API helpers
├── dev/
│   ├── dev-poll.sh     # Cron entry: find ready issues
│   ├── dev-agent.sh    # Implementation agent (claude -p)
│   └── ci-debug.sh     # Woodpecker CI log helper
├── review/
│   ├── review-poll.sh  # Cron entry: find unreviewed PRs
│   └── review-pr.sh    # Review agent (claude -p)
└── factory/
    └── factory-poll.sh # Supervisor: health checks + auto-fix
```

## How It Works

### Dev Agent (Pull System)
1. `dev-poll.sh` scans `backlog`-labeled issues
2. Checks if all dependencies are merged into master
3. Picks the first ready issue, spawns `dev-agent.sh`
4. Agent: creates worktree → `claude -p` implements → commits → pushes → creates PR
5. Waits for CI. If CI fails: feeds errors back to claude (max 2 attempts per phase)
6. Waits for review. If REQUEST_CHANGES: feeds review back to claude
7. On APPROVE: merges PR, cleans up, closes issue

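Dependencies between issues are declared in plain issue-body markdown and parsed by the poll script. A minimal, self-contained sketch of that extraction, mirroring the awk/grep logic of `get_deps()` in `dev/dev-poll.sh` (the sample issue body below is made up for illustration):

```bash
#!/usr/bin/env bash
# Hypothetical issue body; real ones live on Codeberg.
BODY='## Depends on
- #12
- #15 (storage layout)

## Description
Also blocked by #9 inline.'

# Capture lines under a "Depends on"-style header until the next ## section,
# then also pick up inline mentions, and deduplicate numerically.
deps=$(
  {
    echo "$BODY" | awk '
      BEGIN { IGNORECASE=1 }
      /^##? *(Depends on|Blocked by|Dependencies)/ { capture=1; next }
      capture && /^##? / { capture=0 }
      capture { print }
    ' | grep -oP '#\K[0-9]+' || true
    echo "$BODY" | grep -iE '(depends on|blocked by)' | grep -oP '#\K[0-9]+' || true
  } | sort -un
)
echo "$deps"   # one dependency number per line
```

An issue with this body is READY only once issues #9, #12, and #15 are closed with merged PRs.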
### Review Agent
1. `review-poll.sh` finds open PRs with passing CI and no review
2. Spawns `review-pr.sh`, which runs `claude -p` to review the diff
3. Posts structured review comment with verdict (APPROVE / REQUEST_CHANGES / DISCUSS)
4. Creates follow-up issues for pre-existing bugs found during review

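The verdict maps onto the Forgejo/Gitea pull-review endpoint, and the supervisor later recognizes a reviewed head SHA via an HTML-comment marker in the review body. A hedged sketch of building that request payload (field names per the Gitea-style API; verify against your instance; the SHA and review text are made up):

```bash
#!/usr/bin/env bash
# APPROVED / REQUEST_CHANGES / COMMENT are the event values the
# Gitea-style review API accepts. The hidden marker lets the
# supervisor detect that this head SHA was already reviewed.
VERDICT="APPROVED"
HEAD_SHA="abc1234"   # hypothetical head commit
BODY="LGTM; tests cover the new path.

<!-- reviewed: ${HEAD_SHA} -->"

payload=$(jq -n --arg event "$VERDICT" --arg body "$BODY" \
  '{event: $event, body: $body}')
echo "$payload"

# Posting it would look like (not executed here):
#   curl -sf -X POST -H "Authorization: token ${REVIEW_BOT_TOKEN}" \
#     -H "Content-Type: application/json" \
#     "${CODEBERG_API}/pulls/${PR}/reviews" -d "$payload"
```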
### Factory Supervisor
1. `factory-poll.sh` runs pure bash checks every 10 minutes:
   - CI: stuck or failing pipelines
   - PRs: derailed (CI fail + no activity)
   - Dev-agent: alive and making progress
   - Git: clean state on master
   - Infra: RAM, swap, disk, Anvil health
   - Review: unreviewed PRs with passing CI
2. Auto-fixes simple issues (restart Anvil, retrigger CI)
3. Escalates complex issues via openclaw system event

## Requirements

- [Claude CLI](https://docs.anthropic.com/en/docs/claude-cli) (`claude` in PATH)
- [Foundry](https://getfoundry.sh/) (`forge`, `cast`, `anvil`)
- [Woodpecker CI](https://woodpecker-ci.org/) (local instance)
- PostgreSQL client (`psql`)
- [OpenClaw](https://openclaw.ai/) (for system event notifications, optional)
- `jq`, `curl`, `git`

## Design Principles

- **Bash for checks, AI for fixes** — don't burn tokens on health checks
- **Pull system** — readiness derived from merged dependencies, not labels
- **CI fix loop** — each phase gets fresh retry budget
- **Prior art** — dev-agent searches closed PRs to avoid rework
- **No secrets in repo** — everything via `.env`
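The first principle is essentially a gate: accumulate alert lines in plain bash and only reach for the model when the list is non-empty. A minimal illustration of that pattern (the check and thresholds here are stand-ins; the `claude` invocation is shown only as a comment):

```bash
#!/usr/bin/env bash
ALERTS=""
alert() { ALERTS="${ALERTS}• $*\n"; }

# Cheap bash checks run unconditionally (hypothetical stand-in value).
avail_mb=4096
[ "$avail_mb" -lt 500 ] && alert "RAM: only ${avail_mb}MB available"

if [ -z "$ALERTS" ]; then
  echo "all clear"        # zero tokens spent on the happy path
else
  # Only now would the factory escalate to the model, e.g.:
  #   claude -p "Diagnose and fix: $(printf '%b' "$ALERTS")"
  printf '%b' "$ALERTS"
fi
```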
79  dev/ci-debug.sh  Executable file
@@ -0,0 +1,79 @@
#!/usr/bin/env bash
# ci-debug.sh — Query Woodpecker CI (CLI for logs, API for structure)
#
# Usage:
#   ci-debug.sh status [pipeline]       — pipeline overview + step states
#   ci-debug.sh logs <pipeline> <step#> — full logs for a step
#   ci-debug.sh failures [pipeline]     — all failed step logs
#   ci-debug.sh list [count]            — recent pipelines (default 10)

set -euo pipefail

# Load shared environment
source "$(dirname "$0")/../lib/env.sh"

# Respect a WOODPECKER_SERVER set in .env; fall back to the local default.
export WOODPECKER_SERVER="${WOODPECKER_SERVER:-http://localhost:8000}"
# WOODPECKER_TOKEN loaded from .env via env.sh
REPO="johba/harb"
API="${WOODPECKER_SERVER}/api/repos/2"   # 2 = Woodpecker repo id for johba/harb

api() {
  curl -sf -H "Authorization: Bearer ${WOODPECKER_TOKEN}" "${API}/$1"
}

get_latest() {
  api "pipelines?per_page=1" | jq -r '.[0].number'
}

case "${1:-help}" in
  list)
    COUNT="${2:-10}"
    api "pipelines?per_page=${COUNT}" | \
      jq -r '.[] | "#\(.number) \(.status) \(.event) \(.commit[:7]) \(.message | split("\n")[0][:60])"'
    ;;

  status)
    P="${2:-$(get_latest)}"
    echo "Pipeline #${P}:"
    api "pipelines/${P}" | \
      jq -r '"  Status: \(.status)  Event: \(.event)  Commit: \(.commit[:7])"'
    echo "Steps:"
    api "pipelines/${P}" | \
      jq -r '.workflows[]? | "  [\(.name)]", (.children[]? | "    [\(.pid)] \(.name) → \(.state) (exit \(.exit_code))")'
    ;;

  logs)
    P="${2:?Usage: ci-debug.sh logs <pipeline> <step#>}"
    S="${3:?Usage: ci-debug.sh logs <pipeline> <step#>}"
    woodpecker-cli pipeline log show "$REPO" "$P" "$S"
    ;;

  failures)
    P="${2:-$(get_latest)}"
    FAILED=$(api "pipelines/${P}" | \
      jq -r '.workflows[]?.children[]? | select(.state=="failure") | "\(.pid)\t\(.name)"')

    if [ -z "$FAILED" ]; then
      echo "No failed steps in pipeline #${P}"
      exit 0
    fi

    while IFS=$'\t' read -r pid name; do
      echo "=== FAILED: ${name} (step ${pid}) ==="
      woodpecker-cli pipeline log show "$REPO" "$P" "$pid" 2>/dev/null | tail -200
      echo ""
    done <<< "$FAILED"
    ;;

  help|*)
    cat <<'EOF'
ci-debug.sh — Query Woodpecker CI

Commands:
  list [count]             Recent pipelines (default 10)
  status [pipeline]        Pipeline overview + step states
  logs <pipeline> <step#>  Full step logs (step# = pid from status)
  failures [pipeline]      All failed step logs (last 200 lines each)
EOF
    ;;
esac
1205  dev/dev-agent.sh  Executable file
File diff suppressed because it is too large
331  dev/dev-poll.sh  Executable file
@@ -0,0 +1,331 @@
#!/usr/bin/env bash
# dev-poll.sh — Pull-based factory: find the next ready issue and start dev-agent
#
# Pull system: issues labeled "backlog" are candidates. An issue is READY when
# ALL its dependency issues are closed AND their PRs are merged into master.
# No "todo" label needed — readiness is derived from reality.
#
# Priority:
#   1. Orphaned "in-progress" issues (agent died or PR needs attention)
#   2. Ready "backlog" issues (all deps merged)
#
# Usage: cron every 10min

set -euo pipefail

# Load shared environment
source "$(dirname "$0")/../lib/env.sh"

REPO="${CODEBERG_REPO}"
API="${CODEBERG_API}"
LOCKFILE="/tmp/dev-agent.lock"
LOGFILE="${FACTORY_ROOT}/dev/dev-agent.log"
PREFLIGHT_RESULT="/tmp/dev-agent-preflight.json"
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"

log() {
  printf '[%s] poll: %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$*" >> "$LOGFILE"
}

# --- Check if dev-agent already running ---
if [ -f "$LOCKFILE" ]; then
  LOCK_PID=$(cat "$LOCKFILE" 2>/dev/null || echo "")
  if [ -n "$LOCK_PID" ] && kill -0 "$LOCK_PID" 2>/dev/null; then
    log "agent running (PID ${LOCK_PID})"
    exit 0
  fi
  rm -f "$LOCKFILE"
fi

# --- Memory guard ---
AVAIL_MB=$(awk '/MemAvailable/{printf "%d", $2/1024}' /proc/meminfo)
if [ "$AVAIL_MB" -lt 2000 ]; then
  log "SKIP: only ${AVAIL_MB}MB available (need 2000MB)"
  exit 0
fi
# =============================================================================
# HELPER: check if a dependency issue is fully resolved (closed + PR merged)
# =============================================================================
dep_is_merged() {
  local dep_num="$1"

  # Check issue is closed
  local dep_state
  dep_state=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API}/issues/${dep_num}" | jq -r '.state // "open"')
  if [ "$dep_state" != "closed" ]; then
    return 1
  fi

  # Check there's a merged PR for this issue
  # Search closed PRs for title containing "#NNN" or body containing "Fixes #NNN"
  # ("ixes" matches both "Fixes" and "fixes")
  local has_merged
  has_merged=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API}/pulls?state=closed&limit=30" | \
    jq -r --arg num "#${dep_num}" \
      '[.[] | select(.merged == true) | select((.title | contains($num)) or (.body // "" | test("ixes " + $num + "\\b"; "i")))] | length')

  [ "${has_merged:-0}" -gt 0 ]
}

# =============================================================================
# HELPER: extract dependency numbers from issue body
# =============================================================================
get_deps() {
  local issue_body="$1"
  # Extract #NNN references from "Depends on" / "Blocked by" sections
  # Capture the header line AND subsequent lines until next ## section
  {
    echo "$issue_body" | awk '
      BEGIN { IGNORECASE=1 }
      /^##? *(Depends on|Blocked by|Dependencies)/ { capture=1; next }
      capture && /^##? / { capture=0 }
      capture { print }
    ' | grep -oP '#\K[0-9]+' || true
    # Also check inline deps on same line as keyword
    echo "$issue_body" | grep -iE '(depends on|blocked by)' | grep -oP '#\K[0-9]+' || true
  } | sort -un
}

# =============================================================================
# HELPER: check if issue is ready (all deps merged)
# =============================================================================
issue_is_ready() {
  local issue_num="$1"
  local issue_body="$2"

  local deps
  deps=$(get_deps "$issue_body")

  if [ -z "$deps" ]; then
    # No dependencies — always ready
    return 0
  fi

  while IFS= read -r dep; do
    [ -z "$dep" ] && continue
    if ! dep_is_merged "$dep"; then
      log "  #${issue_num} blocked: dep #${dep} not merged"
      return 1
    fi
  done <<< "$deps"

  return 0
}
# =============================================================================
# PRIORITY 1: orphaned in-progress issues
# =============================================================================
log "checking for in-progress issues"
ORPHANS_JSON=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
  "${API}/issues?state=open&labels=in-progress&limit=10&type=issues")

ORPHAN_COUNT=$(echo "$ORPHANS_JSON" | jq 'length')
if [ "$ORPHAN_COUNT" -gt 0 ]; then
  ISSUE_NUM=$(echo "$ORPHANS_JSON" | jq -r '.[0].number')

  # Check if there's already an open PR for this issue
  HAS_PR=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API}/pulls?state=open&limit=20" | \
    jq -r --arg branch "fix/issue-${ISSUE_NUM}" \
      '.[] | select(.head.ref == $branch) | .number' | head -1) || true

  if [ -n "$HAS_PR" ]; then
    PR_SHA=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/pulls/${HAS_PR}" | jq -r '.head.sha') || true
    CI_STATE=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/commits/${PR_SHA}/status" | jq -r '.state // "unknown"') || true

    # Check formal reviews
    HAS_APPROVE=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/pulls/${HAS_PR}/reviews" | \
      jq -r '[.[] | select(.state == "APPROVED") | select(.stale == false)] | length') || true
    HAS_CHANGES=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/pulls/${HAS_PR}/reviews" | \
      jq -r '[.[] | select(.state == "REQUEST_CHANGES") | select(.stale == false)] | length') || true

    if [ "$CI_STATE" = "success" ] && [ "${HAS_APPROVE:-0}" -gt 0 ]; then
      log "PR #${HAS_PR} approved + CI green → merging"
      MERGE_CODE=$(curl -s -o /dev/null -w "%{http_code}" -X POST \
        -H "Authorization: token ${CODEBERG_TOKEN}" \
        -H "Content-Type: application/json" \
        "${API}/pulls/${HAS_PR}/merge" \
        -d '{"Do":"merge","delete_branch_after_merge":true}')

      if [ "$MERGE_CODE" = "200" ] || [ "$MERGE_CODE" = "204" ] || [ "$MERGE_CODE" = "405" ]; then
        log "PR #${HAS_PR} merged! Closing #${ISSUE_NUM}"
        curl -sf -X PATCH -H "Authorization: token ${CODEBERG_TOKEN}" \
          -H "Content-Type: application/json" \
          "${API}/issues/${ISSUE_NUM}" -d '{"state":"closed"}' >/dev/null 2>&1 || true
        curl -sf -X DELETE -H "Authorization: token ${CODEBERG_TOKEN}" \
          "${API}/issues/${ISSUE_NUM}/labels/in-progress" >/dev/null 2>&1 || true
        openclaw system event --text "✅ PR #${HAS_PR} merged! Issue #${ISSUE_NUM} done." --mode now 2>/dev/null || true
      else
        log "merge failed (HTTP ${MERGE_CODE})"
      fi
      exit 0

    elif [ "$CI_STATE" = "success" ] && [ "${HAS_CHANGES:-0}" -gt 0 ]; then
      log "issue #${ISSUE_NUM} PR #${HAS_PR} has REQUEST_CHANGES — spawning agent"
      nohup "${SCRIPT_DIR}/dev-agent.sh" "$ISSUE_NUM" >> "$LOGFILE" 2>&1 &
      log "started dev-agent PID $! for issue #${ISSUE_NUM} (review fix)"
      exit 0

    elif [ "$CI_STATE" = "failure" ] || [ "$CI_STATE" = "error" ]; then
      log "issue #${ISSUE_NUM} PR #${HAS_PR} CI failed — spawning agent to fix"
      nohup "${SCRIPT_DIR}/dev-agent.sh" "$ISSUE_NUM" >> "$LOGFILE" 2>&1 &
      log "started dev-agent PID $! for issue #${ISSUE_NUM} (CI fix)"
      exit 0

    else
      log "issue #${ISSUE_NUM} has open PR #${HAS_PR} (CI: ${CI_STATE}, waiting)"
    fi
  else
    log "recovering orphaned issue #${ISSUE_NUM} (no PR found)"
    nohup "${SCRIPT_DIR}/dev-agent.sh" "$ISSUE_NUM" >> "$LOGFILE" 2>&1 &
    log "started dev-agent PID $! for issue #${ISSUE_NUM} (recovery)"
    exit 0
  fi
fi
# =============================================================================
# PRIORITY 2: find ready backlog issues (pull system)
# =============================================================================
log "scanning backlog for ready issues"
BACKLOG_JSON=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
  "${API}/issues?state=open&labels=backlog&limit=20&type=issues")

BACKLOG_COUNT=$(echo "$BACKLOG_JSON" | jq 'length')
if [ "$BACKLOG_COUNT" -eq 0 ]; then
  log "no backlog issues"
  exit 0
fi

log "found ${BACKLOG_COUNT} backlog issues"

# Check each for readiness
READY_ISSUE=""
for i in $(seq 0 $((BACKLOG_COUNT - 1))); do
  ISSUE_NUM=$(echo "$BACKLOG_JSON" | jq -r ".[$i].number")
  ISSUE_BODY=$(echo "$BACKLOG_JSON" | jq -r ".[$i].body // \"\"")

  if ! issue_is_ready "$ISSUE_NUM" "$ISSUE_BODY"; then
    continue
  fi

  # Check if there's already an open PR for this issue that needs attention
  EXISTING_PR=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API}/pulls?state=open&limit=20" | \
    jq -r --arg branch "fix/issue-${ISSUE_NUM}" --arg num "#${ISSUE_NUM}" \
      '.[] | select((.head.ref == $branch) or (.title | contains($num))) | .number' | head -1) || true

  if [ -n "$EXISTING_PR" ]; then
    PR_SHA=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/pulls/${EXISTING_PR}" | jq -r '.head.sha') || true
    CI_STATE=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/commits/${PR_SHA}/status" | jq -r '.state // "unknown"') || true
    HAS_APPROVE=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/pulls/${EXISTING_PR}/reviews" | \
      jq -r '[.[] | select(.state == "APPROVED") | select(.stale == false)] | length') || true
    HAS_CHANGES=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API}/pulls/${EXISTING_PR}/reviews" | \
      jq -r '[.[] | select(.state == "REQUEST_CHANGES") | select(.stale == false)] | length') || true

    if [ "$CI_STATE" = "success" ] && [ "${HAS_APPROVE:-0}" -gt 0 ]; then
      log "#${ISSUE_NUM} PR #${EXISTING_PR} approved + CI green → merging"
      MERGE_CODE=$(curl -s -o /dev/null -w "%{http_code}" -X POST \
        -H "Authorization: token ${CODEBERG_TOKEN}" \
        -H "Content-Type: application/json" \
        "${API}/pulls/${EXISTING_PR}/merge" \
        -d '{"Do":"merge","delete_branch_after_merge":true}')
      if [ "$MERGE_CODE" = "200" ] || [ "$MERGE_CODE" = "204" ] || [ "$MERGE_CODE" = "405" ]; then
        log "PR #${EXISTING_PR} merged! Closing #${ISSUE_NUM}"
        curl -sf -X PATCH -H "Authorization: token ${CODEBERG_TOKEN}" \
          -H "Content-Type: application/json" \
          "${API}/issues/${ISSUE_NUM}" -d '{"state":"closed"}' >/dev/null 2>&1 || true
        openclaw system event --text "✅ PR #${EXISTING_PR} merged! Issue #${ISSUE_NUM} done." --mode now 2>/dev/null || true
      fi
      continue

    elif [ "${HAS_CHANGES:-0}" -gt 0 ]; then
      log "#${ISSUE_NUM} PR #${EXISTING_PR} has REQUEST_CHANGES — picking up"
      READY_ISSUE="$ISSUE_NUM"
      break

    elif [ "$CI_STATE" = "failure" ] || [ "$CI_STATE" = "error" ]; then
      log "#${ISSUE_NUM} PR #${EXISTING_PR} CI failed — picking up"
      READY_ISSUE="$ISSUE_NUM"
      break

    else
      log "#${ISSUE_NUM} PR #${EXISTING_PR} exists (CI: ${CI_STATE}, waiting)"
      continue
    fi
  fi

  READY_ISSUE="$ISSUE_NUM"
  log "#${ISSUE_NUM} is READY (all deps merged, no existing PR)"
  break
done

if [ -z "$READY_ISSUE" ]; then
  log "no ready issues (all blocked by unmerged deps)"
  exit 0
fi
# =============================================================================
# LAUNCH: start dev-agent for the ready issue
# =============================================================================
log "launching dev-agent for #${READY_ISSUE}"
rm -f "$PREFLIGHT_RESULT"

nohup "${SCRIPT_DIR}/dev-agent.sh" "$READY_ISSUE" >> "$LOGFILE" 2>&1 &
AGENT_PID=$!

# Wait briefly for preflight (agent writes result before claiming)
for w in $(seq 1 30); do
  if [ -f "$PREFLIGHT_RESULT" ]; then
    break
  fi
  if ! kill -0 "$AGENT_PID" 2>/dev/null; then
    break
  fi
  sleep 2
done

if [ -f "$PREFLIGHT_RESULT" ]; then
  # Read both fields before deleting the file (the reason is needed
  # in the too_large branch below)
  PREFLIGHT_STATUS=$(jq -r '.status // "unknown"' < "$PREFLIGHT_RESULT")
  PREFLIGHT_REASON=$(jq -r '.reason // "unspecified"' < "$PREFLIGHT_RESULT" 2>/dev/null || echo "unspecified")
  rm -f "$PREFLIGHT_RESULT"

  case "$PREFLIGHT_STATUS" in
    ready)
      log "dev-agent running for #${READY_ISSUE}"
      ;;
    unmet_dependency)
      log "#${READY_ISSUE} has code-level dependency (preflight blocked)"
      wait "$AGENT_PID" 2>/dev/null || true
      ;;
    too_large)
      log "#${READY_ISSUE} too large: ${PREFLIGHT_REASON}"
      # Label as underspecified
      curl -sf -X POST -H "Authorization: token ${CODEBERG_TOKEN}" \
        -H "Content-Type: application/json" \
        "${API}/issues/${READY_ISSUE}/labels" \
        -d '{"labels":["underspecified"]}' >/dev/null 2>&1 || true
      ;;
    already_done)
      log "#${READY_ISSUE} already done"
      ;;
    *)
      log "#${READY_ISSUE} unknown preflight: ${PREFLIGHT_STATUS}"
      ;;
  esac
elif kill -0 "$AGENT_PID" 2>/dev/null; then
  log "dev-agent running for #${READY_ISSUE} (passed preflight)"
else
  log "dev-agent exited for #${READY_ISSUE} without preflight result"
fi
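The preflight handshake above relies on dev-agent.sh (whose diff is suppressed above) writing a small JSON file before claiming the issue. A sketch of that assumed contract and how the poll side consumes it; the field names come from the jq queries in dev-poll.sh, while the example values are hypothetical:

```bash
#!/usr/bin/env bash
RESULT=$(mktemp)

# What the agent is assumed to write (status values handled by dev-poll.sh:
# ready / unmet_dependency / too_large / already_done).
cat > "$RESULT" <<'EOF'
{"status": "too_large", "reason": "touches 14 files across 3 subsystems"}
EOF

# The poll side reads both fields before deleting the file.
STATUS=$(jq -r '.status // "unknown"' < "$RESULT")
REASON=$(jq -r '.reason // "unspecified"' < "$RESULT")
rm -f "$RESULT"
echo "${STATUS}: ${REASON}"
```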
195  factory/factory-poll.sh  Executable file
@@ -0,0 +1,195 @@
#!/usr/bin/env bash
# factory-poll.sh — Factory supervisor: bash checks + claude -p for fixes
#
# Runs every 10min via cron. Does all health checks in bash (zero tokens).
# Only invokes claude -p when intervention is needed.
#
# Cron: */10 * * * * /path/to/dark-factory/factory/factory-poll.sh
#
# Peek: cat /tmp/factory-status
# Log:  tail -f /path/to/dark-factory/factory/factory.log

source "$(dirname "$0")/../lib/env.sh"

LOGFILE="${FACTORY_ROOT}/factory/factory.log"
STATUSFILE="/tmp/factory-status"
LOCKFILE="/tmp/factory-poll.lock"

# Prevent overlapping runs
if [ -f "$LOCKFILE" ]; then
  LOCK_PID=$(cat "$LOCKFILE" 2>/dev/null)
  if kill -0 "$LOCK_PID" 2>/dev/null; then
    exit 0
  fi
  rm -f "$LOCKFILE"
fi
echo $$ > "$LOCKFILE"
trap 'rm -f "$LOCKFILE" "$STATUSFILE"' EXIT

status() {
  printf '[%s] factory: %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$*" > "$STATUSFILE"
  log "$*" >> "$LOGFILE"
}

ALERTS=""
alert() {
  ALERTS="${ALERTS}• $*\n"
  log "ALERT: $*" >> "$LOGFILE"
}
# =============================================================================
# CHECK 1: Stuck/failed CI pipelines
# =============================================================================
status "checking CI"

STUCK_CI=$(wpdb -c "SELECT count(*) FROM pipelines WHERE repo_id=2 AND status='running' AND EXTRACT(EPOCH FROM now() - to_timestamp(started)) > 1200;" 2>/dev/null | xargs)
[ "${STUCK_CI:-0}" -gt 0 ] && alert "CI: ${STUCK_CI} pipeline(s) running >20min"

PENDING_CI=$(wpdb -c "SELECT count(*) FROM pipelines WHERE repo_id=2 AND status='pending' AND EXTRACT(EPOCH FROM now() - to_timestamp(created)) > 1800;" 2>/dev/null | xargs)
[ "${PENDING_CI:-0}" -gt 0 ] && alert "CI: ${PENDING_CI} pipeline(s) pending >30min"

# =============================================================================
# CHECK 2: Derailed PRs — open with CI failure + no push in 30min
# =============================================================================
status "checking PRs"

OPEN_PRS=$(codeberg_api GET "/pulls?state=open&limit=10" 2>/dev/null | jq -r '.[].number' 2>/dev/null || true)
for pr in $OPEN_PRS; do
  PR_SHA=$(codeberg_api GET "/pulls/${pr}" 2>/dev/null | jq -r '.head.sha' 2>/dev/null || true)
  [ -z "$PR_SHA" ] && continue

  CI_STATE=$(codeberg_api GET "/commits/${PR_SHA}/status" 2>/dev/null | jq -r '.state // "unknown"' 2>/dev/null || true)
  if [ "$CI_STATE" = "failure" ] || [ "$CI_STATE" = "error" ]; then
    # Check when last push happened
    UPDATED=$(codeberg_api GET "/pulls/${pr}" 2>/dev/null | jq -r '.updated_at // ""' 2>/dev/null || true)
    if [ -n "$UPDATED" ]; then
      UPDATED_EPOCH=$(date -d "$UPDATED" +%s 2>/dev/null || echo 0)
      NOW_EPOCH=$(date +%s)
      AGE_MIN=$(( (NOW_EPOCH - UPDATED_EPOCH) / 60 ))
      if [ "$AGE_MIN" -gt 30 ]; then
        alert "PR #${pr}: CI=${CI_STATE}, no activity for ${AGE_MIN}min"
      fi
    fi
  fi
done
# =============================================================================
# CHECK 3: Dev-agent health
# =============================================================================
status "checking dev-agent"

DEV_LOCK="/tmp/dev-agent.lock"
if [ -f "$DEV_LOCK" ]; then
  DEV_PID=$(cat "$DEV_LOCK" 2>/dev/null)
  if ! kill -0 "$DEV_PID" 2>/dev/null; then
    alert "Dev-agent: lock file exists but PID ${DEV_PID} is dead (stale lock)"
  else
    # Check if it's making progress — same status for >30min?
    DEV_STATUS=$(cat /tmp/dev-agent-status 2>/dev/null || echo "")
    DEV_STATUS_AGE=$(stat -c %Y /tmp/dev-agent-status 2>/dev/null || echo 0)
    NOW_EPOCH=$(date +%s)
    STATUS_AGE_MIN=$(( (NOW_EPOCH - DEV_STATUS_AGE) / 60 ))
    if [ "$STATUS_AGE_MIN" -gt 30 ]; then
      alert "Dev-agent: status unchanged for ${STATUS_AGE_MIN}min — possibly stuck"
    fi
  fi
fi

# =============================================================================
# CHECK 4: Git repo health
# =============================================================================
status "checking git repo"

cd "${HARB_REPO_ROOT}" 2>/dev/null || true
GIT_STATUS=$(git status --porcelain 2>/dev/null | wc -l)
GIT_BRANCH=$(git branch --show-current 2>/dev/null || echo "unknown")
# Grouped so the || doesn't bind into the &&/|| chain
GIT_REBASE=$({ [ -d .git/rebase-merge ] || [ -d .git/rebase-apply ]; } && echo "yes" || echo "no")

if [ "$GIT_REBASE" = "yes" ]; then
  alert "Git: stale rebase in progress on main repo"
fi
if [ "$GIT_BRANCH" != "master" ]; then
  alert "Git: main repo on branch '${GIT_BRANCH}' instead of master"
fi
# =============================================================================
# CHECK 5: Infra — RAM, swap, disk, docker
# =============================================================================
status "checking infra"

AVAIL_MB=$(free -m | awk '/Mem:/{print $7}')
SWAP_USED_MB=$(free -m | awk '/Swap:/{print $3}')
DISK_PERCENT=$(df -h / | awk 'NR==2{print $5}' | tr -d '%')

if [ "${AVAIL_MB:-0}" -lt 500 ]; then
  alert "RAM: only ${AVAIL_MB}MB available"
fi
if [ "${SWAP_USED_MB:-0}" -gt 3000 ]; then
  alert "Swap: ${SWAP_USED_MB}MB used (>3GB)"
fi
if [ "${DISK_PERCENT:-0}" -gt 85 ]; then
  alert "Disk: ${DISK_PERCENT}% full"
fi

# Check if Anvil is responsive
ANVIL_OK=$(curl -sf -m 5 -X POST -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"eth_chainId","params":[],"id":1}' \
  http://localhost:8545 2>/dev/null | jq -r '.result // "fail"' 2>/dev/null || echo "fail")
if [ "$ANVIL_OK" = "fail" ]; then
  # Try to auto-fix
  sudo docker restart harb-anvil-1 2>/dev/null && \
    log "Auto-fixed: restarted frozen Anvil" >> "$LOGFILE" || \
    alert "Anvil: unresponsive and restart failed"
fi
# =============================================================================
|
||||||
|
# CHECK 6: Review bot — unreviewed PRs older than 1h
|
||||||
|
# =============================================================================
|
||||||
|
status "checking review backlog"
|
||||||
|
|
||||||
|
for pr in $OPEN_PRS; do
|
||||||
|
PR_SHA=$(codeberg_api GET "/pulls/${pr}" 2>/dev/null | jq -r '.head.sha' 2>/dev/null || true)
|
||||||
|
[ -z "$PR_SHA" ] && continue
|
||||||
|
|
||||||
|
CI_STATE=$(codeberg_api GET "/commits/${PR_SHA}/status" 2>/dev/null | jq -r '.state // "unknown"' 2>/dev/null || true)
|
||||||
|
[ "$CI_STATE" != "success" ] && continue
|
||||||
|
|
||||||
|
# CI passed — check if reviewed at this SHA
|
||||||
|
HAS_REVIEW=$(codeberg_api GET "/issues/${pr}/comments?limit=50" 2>/dev/null | \
|
||||||
|
jq -r --arg sha "$PR_SHA" '[.[] | select(.body | contains("<!-- reviewed: " + $sha))] | length' 2>/dev/null || echo "0")
|
||||||
|
|
||||||
|
if [ "${HAS_REVIEW:-0}" -eq 0 ]; then
|
||||||
|
PR_UPDATED=$(codeberg_api GET "/pulls/${pr}" 2>/dev/null | jq -r '.updated_at // ""' 2>/dev/null || true)
|
||||||
|
if [ -n "$PR_UPDATED" ]; then
|
||||||
|
UPDATED_EPOCH=$(date -d "$PR_UPDATED" +%s 2>/dev/null || echo 0)
|
||||||
|
NOW_EPOCH=$(date +%s)
|
||||||
|
AGE_MIN=$(( (NOW_EPOCH - UPDATED_EPOCH) / 60 ))
|
||||||
|
if [ "$AGE_MIN" -gt 60 ]; then
|
||||||
|
alert "PR #${pr}: CI passed but no review for ${AGE_MIN}min"
|
||||||
|
# Auto-trigger review
|
||||||
|
bash "${FACTORY_ROOT}/review/review-pr.sh" "$pr" >> "$LOGFILE" 2>&1 &
|
||||||
|
log "Auto-triggered review for PR #${pr}" >> "$LOGFILE"
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
# =============================================================================
|
||||||
|
# RESULT: escalate or all clear
|
||||||
|
# =============================================================================
|
||||||
|
|
||||||
|
if [ -n "$ALERTS" ]; then
|
||||||
|
log "$(echo -e "$ALERTS")" >> "$LOGFILE"
|
||||||
|
|
||||||
|
# Determine if claude -p is needed (complex issues) or just notify
|
||||||
|
NEEDS_CLAUDE=false
|
||||||
|
|
||||||
|
# For now: notify via openclaw system event, let Clawy decide
|
||||||
|
ALERT_TEXT=$(echo -e "$ALERTS")
|
||||||
|
openclaw system event --text "🏭 Factory Alert:\n${ALERT_TEXT}" --mode now 2>/dev/null || true
|
||||||
|
|
||||||
|
status "alerts sent"
|
||||||
|
else
|
||||||
|
log "all clear" >> "$LOGFILE"
|
||||||
|
status "all clear"
|
||||||
|
fi
|
||||||
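The staleness checks above all follow the same pattern: parse an ISO-8601 timestamp from the API with GNU `date -d`, subtract from the current epoch, and convert to minutes. A minimal standalone sketch of that arithmetic, using fixed timestamps instead of live API data:

```shell
# Same age computation as the PR-backlog check, with fixed inputs for clarity.
UPDATED_EPOCH=$(date -d "2024-01-01T00:00:00Z" +%s)   # PR's updated_at (demo value)
NOW_EPOCH=$(date -d "2024-01-01T02:05:00Z" +%s)       # pretend "now" (demo value)
AGE_MIN=$(( (NOW_EPOCH - UPDATED_EPOCH) / 60 ))
echo "age: ${AGE_MIN}min"                              # → age: 125min
if [ "$AGE_MIN" -gt 60 ]; then
    echo "stale — would trigger review"
fi
```

Note `date -d` is GNU-specific, which is why the real check guards it with `|| echo 0`.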
66
lib/env.sh
Executable file

@ -0,0 +1,66 @@
#!/usr/bin/env bash
# env.sh — Load environment and shared utilities
# Source this at the top of every script: source "$(dirname "$0")/lib/env.sh"

set -euo pipefail

# Resolve script root (parent of lib/)
FACTORY_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"

# Load .env if present
if [ -f "$FACTORY_ROOT/.env" ]; then
    set -a
    # shellcheck source=/dev/null
    source "$FACTORY_ROOT/.env"
    set +a
fi

# HOME first (cron/systemd may not set it; set -u would trip on it), then PATH: foundry, node, system
export HOME="${HOME:-/home/debian}"
export PATH="${HOME}/.foundry/bin:${HOME}/.nvm/versions/node/v22.20.0/bin:/usr/local/bin:/usr/bin:/bin:${PATH}"

# Codeberg token: env var > ~/.netrc
if [ -z "${CODEBERG_TOKEN:-}" ]; then
    CODEBERG_TOKEN="$(awk '/codeberg.org/{getline;getline;print $2}' ~/.netrc 2>/dev/null || true)"
fi
export CODEBERG_TOKEN

# Defaults
export CODEBERG_REPO="${CODEBERG_REPO:-johba/harb}"
export CODEBERG_API="${CODEBERG_API:-https://codeberg.org/api/v1/repos/${CODEBERG_REPO}}"
export HARB_REPO_ROOT="${HARB_REPO_ROOT:-/home/debian/harb}"
export WOODPECKER_SERVER="${WOODPECKER_SERVER:-http://localhost:8000}"
export CLAUDE_TIMEOUT="${CLAUDE_TIMEOUT:-7200}"

# Shared log helper
log() {
    printf '[%s] %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$*"
}

# Codeberg API helper — usage: codeberg_api GET /issues?state=open
codeberg_api() {
    local method="$1" path="$2"
    shift 2
    curl -sf -X "$method" \
        -H "Authorization: token ${CODEBERG_TOKEN}" \
        -H "Content-Type: application/json" \
        "${CODEBERG_API}${path}" "$@"
}

# Woodpecker API helper
woodpecker_api() {
    local path="$1"
    shift
    curl -sf \
        -H "Authorization: Bearer ${WOODPECKER_TOKEN}" \
        "${WOODPECKER_SERVER}/api${path}" "$@"
}

# Woodpecker DB query helper
wpdb() {
    PGPASSWORD="${WOODPECKER_DB_PASSWORD}" psql \
        -U "${WOODPECKER_DB_USER:-woodpecker}" \
        -h "${WOODPECKER_DB_HOST:-127.0.0.1}" \
        -d "${WOODPECKER_DB_NAME:-woodpecker}" \
        -t "$@" 2>/dev/null
}
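Every agent script shares the `log` timestamp format above. A standalone sketch of that output shape (the function is re-declared here exactly as in lib/env.sh, so no `.env` or network is needed to try it):

```shell
# Re-declaration of the shared log helper from lib/env.sh.
log() {
    printf '[%s] %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$*"
}

line=$(log "woodpecker build ok")
echo "$line"
# Shape check: "[YYYY-MM-DD HH:MM:SS UTC] woodpecker build ok"
case "$line" in
    \[*" UTC] woodpecker build ok") echo "format ok" ;;
    *) echo "format unexpected" ;;
esac
```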
93
review/review-poll.sh
Executable file

@ -0,0 +1,93 @@
#!/usr/bin/env bash
# review-poll.sh — Poll open PRs and review those with green CI
#
# Peek while running: cat /tmp/harb-review-status
# Full log: tail -f ~/scripts/harb-review/review.log

set -euo pipefail

# Load shared environment
source "$(dirname "$0")/../lib/env.sh"

export HOME="${HOME:-/home/debian}"

REPO="${CODEBERG_REPO}"
SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"

API_BASE="${CODEBERG_API}"
LOGFILE="$SCRIPT_DIR/review.log"
MAX_REVIEWS=3

log() {
    printf '[%s] %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$*" >> "$LOGFILE"
}

# Log rotation
if [ -f "$LOGFILE" ]; then
    LOGSIZE=$(stat -c%s "$LOGFILE" 2>/dev/null || echo 0)
    if [ "$LOGSIZE" -gt 102400 ]; then
        mv "$LOGFILE" "$LOGFILE.old"
        log "Log rotated"
    fi
fi

log "--- Poll start ---"

PRS=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/pulls?state=open&limit=20" | \
    jq -r '.[] | select(.base.ref == "master") | select(.draft != true) | select(.title | test("^\\[?WIP[\\]:]"; "i") | not) | "\(.number) \(.head.sha)"')

if [ -z "$PRS" ]; then
    log "No open PRs targeting master"
    exit 0
fi

TOTAL=$(echo "$PRS" | wc -l)
log "Found ${TOTAL} open PRs"

REVIEWED=0
SKIPPED=0

while IFS= read -r line; do
    PR_NUM=$(echo "$line" | awk '{print $1}')
    PR_SHA=$(echo "$line" | awk '{print $2}')

    CI_STATE=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
        "${API_BASE}/commits/${PR_SHA}/status" | jq -r '.state // "unknown"')

    if [ "$CI_STATE" != "success" ]; then
        log " #${PR_NUM} CI=${CI_STATE}, skip"
        SKIPPED=$((SKIPPED + 1))
        continue
    fi

    # Check formal Codeberg reviews (not comment markers)
    HAS_REVIEW=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
        "${API_BASE}/pulls/${PR_NUM}/reviews" | \
        jq -r --arg sha "$PR_SHA" \
            '[.[] | select(.commit_id == $sha) | select(.state != "COMMENT")] | length')

    if [ "${HAS_REVIEW:-0}" -gt "0" ]; then
        log " #${PR_NUM} formal review exists for ${PR_SHA:0:7}, skip"
        SKIPPED=$((SKIPPED + 1))
        continue
    fi

    log " #${PR_NUM} needs review (CI=success, SHA=${PR_SHA:0:7})"

    if "${SCRIPT_DIR}/review-pr.sh" "$PR_NUM" 2>&1; then
        REVIEWED=$((REVIEWED + 1))
    else
        log " #${PR_NUM} review failed"
    fi

    if [ "$REVIEWED" -ge "$MAX_REVIEWS" ]; then
        log "Hit max reviews (${MAX_REVIEWS}), stopping"
        break
    fi

    sleep 2

done <<< "$PRS"

log "--- Poll done: ${REVIEWED} reviewed, ${SKIPPED} skipped ---"
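The jq filter in the poll drops drafts and WIP-titled PRs. A standalone sketch of the same title guard in plain bash, so it can be eyeballed without live API data (assumption: bash's ERE `=~` with `nocasematch` behaves like jq's `test(...; "i")` for this pattern; the titles are made up):

```shell
# Same regex as the jq filter: optional leading "[", then WIP, then "]" or ":".
shopt -s nocasematch
wip_re='^\[?WIP[]:]'
is_wip() { [[ $1 =~ $wip_re ]]; }

for title in "[WIP] dark factory" "wip: review poller" "fix: stale lock handling"; do
    if is_wip "$title"; then
        echo "skip: $title"
    else
        echo "review: $title"
    fi
done
# → skip: [WIP] dark factory
# → skip: wip: review poller
# → review: fix: stale lock handling
```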
734
review/review-pr.sh
Executable file

@ -0,0 +1,734 @@
#!/usr/bin/env bash
# review-pr.sh — AI-powered PR review using claude CLI
#
# Usage: ./review-pr.sh <pr-number> [--force]
#
# Features:
# - Full review on first pass
# - Incremental re-review when previous review exists (verifies findings addressed)
# - Auto-creates follow-up issues for pre-existing bugs flagged by reviewer
# - JSON output format with validation + retry
#
# Peek while running: cat /tmp/harb-review-status
# Watch log: tail -f ~/scripts/harb-review/review.log

set -euo pipefail

# Load shared environment
source "$(dirname "$0")/../lib/env.sh"

PR_NUMBER="${1:?Usage: review-pr.sh <pr-number> [--force]}"
FORCE="${2:-}"
REPO="${CODEBERG_REPO}"
REPO_ROOT="${HARB_REPO_ROOT}"

# Bot account for posting reviews (separate user required for branch protection approvals)
API_BASE="${CODEBERG_API}"
LOCKFILE="/tmp/harb-review.lock"
STATUSFILE="/tmp/harb-review-status"
LOGDIR="${FACTORY_ROOT}/review"
LOGFILE="$LOGDIR/review.log"
MIN_MEM_MB=1500
MAX_DIFF=25000
MAX_ATTEMPTS=2
TMPDIR=$(mktemp -d)

log() {
    printf '[%s] PR#%s %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$PR_NUMBER" "$*" >> "$LOGFILE"
}

status() {
    printf '[%s] PR #%s: %s\n' "$(date -u '+%Y-%m-%d %H:%M:%S UTC')" "$PR_NUMBER" "$*" > "$STATUSFILE"
    log "$*"
}

cleanup() {
    rm -rf "$TMPDIR"
    rm -f "$LOCKFILE" "$STATUSFILE"
}
trap cleanup EXIT

# Log rotation (100KB + 1 archive)
if [ -f "$LOGFILE" ] && [ "$(stat -c%s "$LOGFILE" 2>/dev/null || echo 0)" -gt 102400 ]; then
    mv "$LOGFILE" "$LOGFILE.old"
    log "Log rotated"
fi

# Memory guard
AVAIL_MB=$(awk '/MemAvailable/{printf "%d", $2/1024}' /proc/meminfo)
if [ "$AVAIL_MB" -lt "$MIN_MEM_MB" ]; then
    log "SKIP: only ${AVAIL_MB}MB available (need ${MIN_MEM_MB}MB)"
    exit 0
fi

# Concurrency lock
if [ -f "$LOCKFILE" ]; then
    LOCK_PID=$(cat "$LOCKFILE" 2>/dev/null || echo "")
    if [ -n "$LOCK_PID" ] && kill -0 "$LOCK_PID" 2>/dev/null; then
        log "SKIP: another review running (PID ${LOCK_PID})"
        exit 0
    fi
    log "Removing stale lock (PID ${LOCK_PID:-?})"
    rm -f "$LOCKFILE"
fi
echo $$ > "$LOCKFILE"
# Fetch PR metadata
status "fetching metadata"
PR_JSON=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/pulls/${PR_NUMBER}")

PR_TITLE=$(echo "$PR_JSON" | jq -r '.title')
PR_BODY=$(echo "$PR_JSON" | jq -r '.body // ""')
PR_HEAD=$(echo "$PR_JSON" | jq -r '.head.ref')
PR_BASE=$(echo "$PR_JSON" | jq -r '.base.ref')
PR_SHA=$(echo "$PR_JSON" | jq -r '.head.sha')
PR_STATE=$(echo "$PR_JSON" | jq -r '.state')

log "${PR_TITLE} (${PR_HEAD}→${PR_BASE} ${PR_SHA:0:7})"

if [ "$PR_STATE" != "open" ]; then
    log "SKIP: state=${PR_STATE}"
    cd "$REPO_ROOT"
    git worktree remove "/tmp/harb-review-${PR_NUMBER}" --force 2>/dev/null || true
    rm -rf "/tmp/harb-review-${PR_NUMBER}" 2>/dev/null || true
    exit 0
fi

status "checking CI"
CI_STATE=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/commits/${PR_SHA}/status" | jq -r '.state // "unknown"')

if [ "$CI_STATE" != "success" ]; then
    log "SKIP: CI=${CI_STATE}"
    exit 0
fi

# --- Check for existing reviews ---
status "checking existing reviews"
ALL_COMMENTS=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/issues/${PR_NUMBER}/comments?limit=50")

# Check formal Codeberg reviews — skip if a non-stale review exists for this SHA
EXISTING=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/pulls/${PR_NUMBER}/reviews" | \
    jq -r --arg sha "$PR_SHA" \
        '[.[] | select(.commit_id == $sha) | select(.state != "COMMENT")] | length')

if [ "${EXISTING:-0}" -gt "0" ] && [ "$FORCE" != "--force" ]; then
    log "SKIP: formal review exists for ${PR_SHA:0:7}"
    exit 0
fi

# Find previous review for re-review mode
PREV_REVIEW_JSON=$(echo "$ALL_COMMENTS" | \
    jq -r --arg sha "$PR_SHA" \
        '[.[] | select(.body | contains("<!-- reviewed:")) | select(.body | contains($sha) | not)] | last // empty')

PREV_REVIEW_BODY=""
PREV_REVIEW_SHA=""
IS_RE_REVIEW=false

if [ -n "$PREV_REVIEW_JSON" ] && [ "$PREV_REVIEW_JSON" != "null" ]; then
    PREV_REVIEW_BODY=$(echo "$PREV_REVIEW_JSON" | jq -r '.body')
    PREV_REVIEW_SHA=$(echo "$PREV_REVIEW_BODY" | grep -oP '<!-- reviewed: \K[a-f0-9]+' | head -1)
    IS_RE_REVIEW=true
    log "re-review mode: previous review at ${PREV_REVIEW_SHA:0:7}"

    DEV_RESPONSE=$(echo "$ALL_COMMENTS" | \
        jq -r '[.[] | select(.body | contains("<!-- dev-response:"))] | last // empty')
    DEV_RESPONSE_BODY=""
    if [ -n "$DEV_RESPONSE" ] && [ "$DEV_RESPONSE" != "null" ]; then
        DEV_RESPONSE_BODY=$(echo "$DEV_RESPONSE" | jq -r '.body')
    fi
fi

# --- Fetch diffs ---
status "fetching diff"
curl -s -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/pulls/${PR_NUMBER}.diff" > "${TMPDIR}/full.diff"

FULL_SIZE=$(stat -c%s "${TMPDIR}/full.diff" 2>/dev/null || echo 0)
DIFF=$(head -c "$MAX_DIFF" "${TMPDIR}/full.diff")
DIFF_TRUNCATED=false
if [ "$FULL_SIZE" -gt "$MAX_DIFF" ]; then
    DIFF_TRUNCATED=true
    log "diff truncated: ${FULL_SIZE} → ${MAX_DIFF} bytes"
fi

DIFF_STAT=$(echo "$DIFF" | grep -E '^\+\+\+ b/|^--- a/' | sed 's|^+++ b/||;s|^--- a/||' | grep -v '/dev/null' | sort -u)
ALL_FILES=$(grep -E '^\+\+\+ b/|^--- a/' "${TMPDIR}/full.diff" | sed 's|^+++ b/||;s|^--- a/||' | grep -v '/dev/null' | sort -u)
TRUNCATED_FILES=""
if [ "$DIFF_TRUNCATED" = true ]; then
    TRUNCATED_FILES=$(comm -23 <(echo "$ALL_FILES") <(echo "$DIFF_STAT") | tr '\n' ',' | sed 's/,$//')
fi

# Fetch incremental diff for re-reviews
INCREMENTAL_DIFF=""
if [ "$IS_RE_REVIEW" = true ] && [ -n "$PREV_REVIEW_SHA" ]; then
    status "fetching incremental diff (${PREV_REVIEW_SHA:0:7}..${PR_SHA:0:7})"
    cd "$REPO_ROOT"
    git fetch origin "$PR_HEAD" 2>/dev/null || true
    INCREMENTAL_DIFF=$(git diff "${PREV_REVIEW_SHA}..${PR_SHA}" 2>/dev/null | head -c "$MAX_DIFF") || true
    if [ -z "$INCREMENTAL_DIFF" ]; then
        log "incremental diff empty (SHA not available locally?)"
        IS_RE_REVIEW=false
    fi
fi

# --- Checkout PR branch ---
status "checking out PR branch"
cd "$REPO_ROOT"
git fetch origin "$PR_HEAD" 2>/dev/null || true
REVIEW_WORKTREE="/tmp/harb-review-${PR_NUMBER}"

if [ -d "$REVIEW_WORKTREE" ]; then
    cd "$REVIEW_WORKTREE"
    git checkout --detach "${PR_SHA}" 2>/dev/null || {
        cd "$REPO_ROOT"
        git worktree remove "$REVIEW_WORKTREE" --force 2>/dev/null || true
        rm -rf "$REVIEW_WORKTREE"
        git worktree add "$REVIEW_WORKTREE" "${PR_SHA}" --detach 2>/dev/null
    }
else
    git worktree add "$REVIEW_WORKTREE" "${PR_SHA}" --detach 2>/dev/null
fi

# --- Classify scope ---
HAS_CONTRACTS=false
HAS_FRONTEND=false
HAS_DOCS=false
HAS_INFRA=false

for f in $ALL_FILES; do
    case "$f" in
        onchain/*) HAS_CONTRACTS=true ;;
        landing/*|web-app/*) HAS_FRONTEND=true ;;
        docs/*|*.md) HAS_DOCS=true ;;
        containers/*|.woodpecker/*|scripts/*|docker*|*.sh|*.yml) HAS_INFRA=true ;;
    esac
done

NEEDS_CLAIM_CHECK=false
NEEDS_UX_CHECK=false
if [ "$HAS_FRONTEND" = true ] || [ "$HAS_DOCS" = true ]; then NEEDS_CLAIM_CHECK=true; fi
if [ "$HAS_FRONTEND" = true ]; then NEEDS_UX_CHECK=true; fi

SCOPE_DESC=""
if [ "$HAS_CONTRACTS" = true ] && [ "$HAS_FRONTEND" = false ] && [ "$HAS_DOCS" = false ]; then
    SCOPE_DESC="contracts-only"
elif [ "$HAS_FRONTEND" = true ] && [ "$HAS_CONTRACTS" = false ]; then
    SCOPE_DESC="frontend-only"
elif [ "$HAS_DOCS" = true ] && [ "$HAS_CONTRACTS" = false ] && [ "$HAS_FRONTEND" = false ]; then
    SCOPE_DESC="docs-only"
elif [ "$HAS_INFRA" = true ] && [ "$HAS_CONTRACTS" = false ] && [ "$HAS_FRONTEND" = false ] && [ "$HAS_DOCS" = false ]; then
    SCOPE_DESC="infra-only"
else
    SCOPE_DESC="mixed"
fi
log "scope: ${SCOPE_DESC} (contracts=${HAS_CONTRACTS} frontend=${HAS_FRONTEND} docs=${HAS_DOCS} infra=${HAS_INFRA})"
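The scope classifier above can be exercised against a sample file list. This standalone sketch reuses the same case patterns; the file names are hypothetical:

```shell
# Same classification patterns as the review script, fed a made-up diff file list.
HAS_CONTRACTS=false; HAS_FRONTEND=false; HAS_DOCS=false; HAS_INFRA=false
SAMPLE_FILES="onchain/src/Vault.sol
docs/ARCHITECTURE.md
.woodpecker/build.yml"

for f in $SAMPLE_FILES; do
    case "$f" in
        onchain/*) HAS_CONTRACTS=true ;;
        landing/*|web-app/*) HAS_FRONTEND=true ;;
        docs/*|*.md) HAS_DOCS=true ;;
        containers/*|.woodpecker/*|scripts/*|docker*|*.sh|*.yml) HAS_INFRA=true ;;
    esac
done
echo "contracts=${HAS_CONTRACTS} frontend=${HAS_FRONTEND} docs=${HAS_DOCS} infra=${HAS_INFRA}"
# → contracts=true frontend=false docs=true infra=true  (i.e. a "mixed" PR)
```

Note that `case` stops at the first matching pattern, so `docs/ARCHITECTURE.md` counts as docs via `docs/*` and never reaches `*.md` or the infra patterns.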
# --- Build JSON output schema instructions ---
# These are appended to EVERY prompt (fresh + re-review) so they're always at the end,
# closest to where claude generates output — resists context window forgetting.

JSON_SCHEMA_FRESH='You MUST respond with a single JSON object. No markdown, no commentary outside the JSON.

{
  "sections": [
    {
      "title": "string — section heading (e.g. Code Review, Architecture Check)",
      "findings": [
        {
          "severity": "bug | warning | nit | info",
          "location": "file:line or file — where the issue is",
          "description": "what is wrong and why"
        }
      ]
    }
  ],
  "followups": [
    {
      "title": "string — one-line issue title",
      "details": "string — what is wrong and where (pre-existing, not introduced by this PR)"
    }
  ],
  "verdict": "APPROVE | REQUEST_CHANGES | DISCUSS",
  "verdict_reason": "string — one line explanation"
}'

JSON_SCHEMA_REREVIEW='You MUST respond with a single JSON object. No markdown, no commentary outside the JSON.

{
  "previous_findings": [
    {
      "summary": "string — what was flagged",
      "status": "fixed | not_fixed | partial",
      "explanation": "string — how it was addressed or why not"
    }
  ],
  "new_issues": [
    {
      "severity": "bug | warning | nit | info",
      "location": "file:line or file",
      "description": "string"
    }
  ],
  "followups": [
    {
      "title": "string — one-line issue title",
      "details": "string — pre-existing tech debt"
    }
  ],
  "verdict": "APPROVE | REQUEST_CHANGES | DISCUSS",
  "verdict_reason": "string — one line"
}'
# --- Build prompt ---
status "building prompt"
cat > "${TMPDIR}/prompt.md" << PROMPT_EOF
# PR #${PR_NUMBER}: ${PR_TITLE}

## PR Description
${PR_BODY}

## Changed Files
${ALL_FILES}

## Full Repo Access
You are running in a checkout of the PR branch. You can read ANY file in the repo to verify
claims, check existing code, or understand context. Use this to avoid false positives —
if you're unsure whether something "already exists", read the file before flagging it.

Key docs available: docs/PRODUCT-TRUTH.md, docs/ARCHITECTURE.md, docs/UX-DECISIONS.md, docs/ENVIRONMENT.md
PROMPT_EOF

if [ "$DIFF_TRUNCATED" = true ]; then
    cat >> "${TMPDIR}/prompt.md" << TRUNC_EOF

## Diff Truncated
The full diff is ${FULL_SIZE} bytes but was truncated to ${MAX_DIFF} bytes.
Files NOT included in the diff below: ${TRUNCATED_FILES:-unknown}
Do NOT flag missing files — they exist but were cut for size. Only review what you can see.
TRUNC_EOF
fi

if [ "$IS_RE_REVIEW" = true ]; then
    cat >> "${TMPDIR}/prompt.md" << REREVIEW_EOF

## This is a RE-REVIEW

A previous review at ${PREV_REVIEW_SHA:0:7} requested changes. The developer has pushed fixes.

### Previous Review
${PREV_REVIEW_BODY}
REREVIEW_EOF

    if [ -n "$DEV_RESPONSE_BODY" ]; then
        cat >> "${TMPDIR}/prompt.md" << DEVRESP_EOF

### Developer's Response
${DEV_RESPONSE_BODY}
DEVRESP_EOF
    fi

    cat >> "${TMPDIR}/prompt.md" << INCR_EOF

### Incremental Diff (${PREV_REVIEW_SHA:0:7}..${PR_SHA:0:7})
\`\`\`diff
${INCREMENTAL_DIFF}
\`\`\`

### Full Diff (master..${PR_SHA:0:7})
\`\`\`diff
${DIFF}
\`\`\`

## Your Task
Review the incremental diff. For each finding in the previous review, check if it was addressed.
Then check for new issues introduced by the fix.

## OUTPUT FORMAT — MANDATORY
${JSON_SCHEMA_REREVIEW}
INCR_EOF

else
    # Build task description based on scope
    TASK_DESC="Review this ${SCOPE_DESC} PR."
    if [ "$NEEDS_CLAIM_CHECK" = true ]; then
        TASK_DESC="${TASK_DESC} Check all user-facing claims against docs/PRODUCT-TRUTH.md."
    fi
    TASK_DESC="${TASK_DESC} Check for bugs, logic errors, missing edge cases, broken imports."
    TASK_DESC="${TASK_DESC} Verify architecture patterns match docs/ARCHITECTURE.md."
    if [ "$NEEDS_UX_CHECK" = true ]; then
        TASK_DESC="${TASK_DESC} Check UX/messaging against docs/UX-DECISIONS.md."
    fi

    cat >> "${TMPDIR}/prompt.md" << DIFF_EOF

## Diff
\`\`\`diff
${DIFF}
\`\`\`

## Your Task
${TASK_DESC}

## OUTPUT FORMAT — MANDATORY
${JSON_SCHEMA_FRESH}
DIFF_EOF
fi

PROMPT_SIZE=$(stat -c%s "${TMPDIR}/prompt.md")
log "Prompt: ${PROMPT_SIZE} bytes (re-review: ${IS_RE_REVIEW})"

# --- Run claude with retry on invalid JSON ---
CONTINUE_FLAG=""
if [ "$IS_RE_REVIEW" = true ]; then
    CONTINUE_FLAG="-c"
fi

REVIEW_JSON=""
for attempt in $(seq 1 "$MAX_ATTEMPTS"); do
    status "running claude attempt ${attempt}/${MAX_ATTEMPTS}"
    SECONDS=0

    RAW_OUTPUT=$(cd "$REVIEW_WORKTREE" && claude -p $CONTINUE_FLAG \
        --model sonnet \
        --dangerously-skip-permissions \
        --output-format text \
        < "${TMPDIR}/prompt.md" 2>"${TMPDIR}/claude-stderr.log") || true

    ELAPSED=$SECONDS

    if [ -z "$RAW_OUTPUT" ]; then
        log "attempt ${attempt}: empty output after ${ELAPSED}s"
        continue
    fi

    RAW_SIZE=$(printf '%s' "$RAW_OUTPUT" | wc -c)
    log "attempt ${attempt}: ${RAW_SIZE} bytes in ${ELAPSED}s"

    # Extract JSON — claude might wrap it in ```json ... ``` or add preamble
    # Try raw first, then extract from code fence
    if printf '%s' "$RAW_OUTPUT" | jq -e '.verdict' > /dev/null 2>&1; then
        REVIEW_JSON="$RAW_OUTPUT"
    else
        # Try extracting from code fence
        EXTRACTED=$(printf '%s' "$RAW_OUTPUT" | sed -n '/^```json/,/^```$/p' | sed '1d;$d')
        if [ -n "$EXTRACTED" ] && printf '%s' "$EXTRACTED" | jq -e '.verdict' > /dev/null 2>&1; then
            REVIEW_JSON="$EXTRACTED"
        else
            # Try extracting first { ... } block
            EXTRACTED=$(printf '%s' "$RAW_OUTPUT" | sed -n '/^{/,/^}/p')
            if [ -n "$EXTRACTED" ] && printf '%s' "$EXTRACTED" | jq -e '.verdict' > /dev/null 2>&1; then
                REVIEW_JSON="$EXTRACTED"
            fi
        fi
    fi

    if [ -n "$REVIEW_JSON" ]; then
        # Validate required fields
        VERDICT=$(printf '%s' "$REVIEW_JSON" | jq -r '.verdict // empty')
        if [ -n "$VERDICT" ]; then
            log "attempt ${attempt}: valid JSON, verdict=${VERDICT}"
            break
        else
            log "attempt ${attempt}: JSON missing verdict"
            REVIEW_JSON=""
        fi
    else
        log "attempt ${attempt}: no valid JSON found in output"
        # Save raw output for debugging
        printf '%s' "$RAW_OUTPUT" > "${TMPDIR}/raw-attempt-${attempt}.txt"
    fi

    # For retry, add explicit correction to prompt
    if [ "$attempt" -lt "$MAX_ATTEMPTS" ]; then
        cat >> "${TMPDIR}/prompt.md" << RETRY_EOF

## RETRY — Your previous response was not valid JSON
You MUST output a single JSON object with a "verdict" field. No markdown wrapping. No prose.
Start your response with { and end with }.
RETRY_EOF
        log "appended retry instruction to prompt"
    fi
done
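The code-fence fallback in the extraction can be checked in isolation. This sketch feeds a fenced sample through the same sed pipeline (jq validation is omitted so the demo runs without jq; the sample review text is made up):

```shell
# Build a sample model response containing a fenced JSON block.
# TICKS holds the literal triple-backtick so the fence survives this document.
TICKS='```'
RAW_OUTPUT="Here is my review:
${TICKS}json
{\"verdict\": \"APPROVE\", \"verdict_reason\": \"clean diff\"}
${TICKS}
Let me know."

# Same pipeline as the review script: print the fenced range, drop the fences.
EXTRACTED=$(printf '%s' "$RAW_OUTPUT" \
    | sed -n "/^${TICKS}json/,/^${TICKS}\$/p" \
    | sed '1d;$d')
echo "$EXTRACTED"
# → {"verdict": "APPROVE", "verdict_reason": "clean diff"}
```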
# --- Handle failure: post error comment ---
|
||||||
|
if [ -z "$REVIEW_JSON" ]; then
|
||||||
|
log "ERROR: no valid JSON after ${MAX_ATTEMPTS} attempts"
|
||||||
|
|
||||||
|
ERROR_BODY="## 🤖 AI Review — Error
|
||||||
|
<!-- review-error: ${PR_SHA} -->
|
||||||
|
|
||||||
|
⚠️ Review failed: could not produce structured output after ${MAX_ATTEMPTS} attempts.
|
||||||
|
|
||||||
|
A maintainer should review this PR manually, or re-trigger with \`--force\`.
|
||||||
|
|
||||||
|
---
|
||||||
|
*Failed at \`${PR_SHA:0:7}\`*"
|
||||||
|
|
||||||
|
printf '%s' "$ERROR_BODY" > "${TMPDIR}/comment-body.txt"
|
||||||
|
jq -Rs '{body: .}' < "${TMPDIR}/comment-body.txt" > "${TMPDIR}/comment.json"
|
||||||
|
|
||||||
|
curl -s -o /dev/null -w "%{http_code}" \
|
||||||
|
-X POST \
|
||||||
|
-H "Authorization: token ${CODEBERG_TOKEN}" \
|
||||||
|
-H "Content-Type: application/json" \
|
||||||
|
"${API_BASE}/issues/${PR_NUMBER}/comments" \
|
||||||
|
--data-binary @"${TMPDIR}/comment.json" > /dev/null
|
||||||
|
|
||||||
|
# Save raw outputs for debugging
|
||||||
|
for f in "${TMPDIR}"/raw-attempt-*.txt; do
|
||||||
|
[ -f "$f" ] && cp "$f" "${LOGDIR}/review-pr${PR_NUMBER}-$(basename "$f")"
|
||||||
|
done
|
||||||
|
|
||||||
|
openclaw system event \
|
||||||
|
--text "⚠️ PR #${PR_NUMBER} review failed — no valid JSON output" \
|
||||||
|
--mode now 2>/dev/null || true
|
||||||
|
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
# --- Render JSON → Markdown ---
|
||||||
|
VERDICT=$(printf '%s' "$REVIEW_JSON" | jq -r '.verdict')
|
||||||
|
VERDICT_REASON=$(printf '%s' "$REVIEW_JSON" | jq -r '.verdict_reason // ""')
|
||||||
|
|
||||||
|
render_markdown() {
|
||||||
|
local json="$1"
|
||||||
|
local md=""
|
||||||
|
|
||||||
|
if [ "$IS_RE_REVIEW" = true ]; then
|
||||||
|
# Re-review format
|
||||||
|
local prev_count
|
||||||
|
prev_count=$(printf '%s' "$json" | jq '.previous_findings | length')
|
||||||
|
|
||||||
|
if [ "$prev_count" -gt 0 ]; then
|
||||||
|
md+="### Previous Findings"$'\n'
|
||||||
|
while IFS= read -r finding; do
|
||||||
|
local summary status explanation
|
||||||
|
summary=$(printf '%s' "$finding" | jq -r '.summary')
|
||||||
|
status=$(printf '%s' "$finding" | jq -r '.status')
|
||||||
|
explanation=$(printf '%s' "$finding" | jq -r '.explanation')
|
||||||
|
|
||||||
|
local icon="❓"
|
||||||
|
case "$status" in
|
||||||
|
fixed) icon="✅" ;;
|
||||||
|
not_fixed) icon="❌" ;;
|
||||||
|
partial) icon="⚠️" ;;
|
||||||
|
esac
|
||||||
|
|
||||||
|
md+="- ${summary} → ${icon} ${explanation}"$'\n'
|
||||||
|
done < <(printf '%s' "$json" | jq -c '.previous_findings[]')
|
||||||
|
md+=$'\n'
|
||||||
|
fi
|
||||||
|
|
||||||
|
local new_count
|
||||||
|
new_count=$(printf '%s' "$json" | jq '.new_issues | length')
|
||||||
|
if [ "$new_count" -gt 0 ]; then
|
||||||
|
md+="### New Issues"$'\n'
|
||||||
|
while IFS= read -r issue; do
|
||||||
|
local sev loc desc
|
||||||
|
sev=$(printf '%s' "$issue" | jq -r '.severity')
|
||||||
|
loc=$(printf '%s' "$issue" | jq -r '.location')
|
||||||
|
desc=$(printf '%s' "$issue" | jq -r '.description')
|
||||||
|
|
||||||
|
local icon="ℹ️"
|
||||||
|
case "$sev" in
|
||||||
|
bug) icon="🐛" ;;
|
||||||
|
warning) icon="⚠️" ;;
|
||||||
|
nit) icon="💅" ;;
|
||||||
|
esac
|
||||||
|
|
||||||
|
md+="- ${icon} **${sev}** \`${loc}\`: ${desc}"$'\n'
|
||||||
|
done < <(printf '%s' "$json" | jq -c '.new_issues[]')
|
||||||
|
md+=$'\n'
|
||||||
|
fi
|
||||||
|
|
||||||
|
  else
    # Fresh review format
    while IFS= read -r section; do
      local title
      title=$(printf '%s' "$section" | jq -r '.title')
      local finding_count
      finding_count=$(printf '%s' "$section" | jq '.findings | length')

      md+="### ${title}"$'\n'

      if [ "$finding_count" -eq 0 ]; then
        md+="No issues found."$'\n'$'\n'
      else
        while IFS= read -r finding; do
          local sev loc desc
          sev=$(printf '%s' "$finding" | jq -r '.severity')
          loc=$(printf '%s' "$finding" | jq -r '.location')
          desc=$(printf '%s' "$finding" | jq -r '.description')

          local icon="ℹ️"
          case "$sev" in
            bug)     icon="🐛" ;;
            warning) icon="⚠️" ;;
            nit)     icon="💅" ;;
          esac

          md+="- ${icon} **${sev}** \`${loc}\`: ${desc}"$'\n'
        done < <(printf '%s' "$section" | jq -c '.findings[]')
        md+=$'\n'
      fi
    done < <(printf '%s' "$json" | jq -c '.sections[]')
  fi

  # Follow-ups
  local followup_count
  followup_count=$(printf '%s' "$json" | jq '.followups | length')
  if [ "$followup_count" -gt 0 ]; then
    md+="### Follow-up Issues"$'\n'
    while IFS= read -r fu; do
      local fu_title fu_details
      fu_title=$(printf '%s' "$fu" | jq -r '.title')
      fu_details=$(printf '%s' "$fu" | jq -r '.details')
      md+="- **${fu_title}**: ${fu_details}"$'\n'
    done < <(printf '%s' "$json" | jq -c '.followups[]')
    md+=$'\n'
  fi

  # Verdict
  md+="### Verdict"$'\n'
  md+="**${VERDICT}** — ${VERDICT_REASON}"$'\n'

  printf '%s' "$md"
}

REVIEW_MD=$(render_markdown "$REVIEW_JSON")

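# For reference, render_markdown expects REVIEW_JSON shaped roughly like the
# sketch below. The field names come straight from the jq selectors it uses;
# the values are illustrative only. previous_findings/new_issues are read on
# re-reviews, sections on fresh reviews, followups on both:
#   {
#     "previous_findings": [{"summary": "…", "status": "fixed", "explanation": "…"}],
#     "new_issues": [{"severity": "bug", "location": "…", "description": "…"}],
#     "sections":   [{"title": "…", "findings": [{"severity": "nit", "location": "…", "description": "…"}]}],
#     "followups":  [{"title": "…", "details": "…"}]
#   }
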
# --- Post review to Codeberg ---
status "posting to Codeberg"

REVIEW_TYPE="Review"
if [ "$IS_RE_REVIEW" = true ]; then
  ROUND=$(($(echo "$ALL_COMMENTS" | jq '[.[] | select(.body | contains("<!-- reviewed:"))] | length') + 1))
  REVIEW_TYPE="Re-review (round ${ROUND})"
fi

COMMENT_BODY="## 🤖 AI ${REVIEW_TYPE}
<!-- reviewed: ${PR_SHA} -->

${REVIEW_MD}

---
*Reviewed at \`${PR_SHA:0:7}\`$(if [ "$IS_RE_REVIEW" = true ]; then echo " · Previous: \`${PREV_REVIEW_SHA:0:7}\`"; fi) · [PRODUCT-TRUTH.md](../docs/PRODUCT-TRUTH.md) · [ARCHITECTURE.md](../docs/ARCHITECTURE.md)*"

# jq -Rs slurps the raw text into a single JSON string, so quotes and
# newlines in the markdown are escaped safely
printf '%s' "$COMMENT_BODY" > "${TMPDIR}/comment-body.txt"
jq -Rs '{body: .}' < "${TMPDIR}/comment-body.txt" > "${TMPDIR}/comment.json"

POST_CODE=$(curl -s -o "${TMPDIR}/post-response.txt" -w "%{http_code}" \
  -X POST \
  -H "Authorization: token ${REVIEW_BOT_TOKEN}" \
  -H "Content-Type: application/json" \
  "${API_BASE}/issues/${PR_NUMBER}/comments" \
  --data-binary @"${TMPDIR}/comment.json")

if [ "${POST_CODE}" = "201" ]; then
  log "POSTED comment to Codeberg (as review_bot)"

  # Submit formal Codeberg review (required for branch protection approval)
  REVIEW_EVENT="COMMENT"
  case "$VERDICT" in
    APPROVE)                 REVIEW_EVENT="APPROVED" ;;
    REQUEST_CHANGES|DISCUSS) REVIEW_EVENT="REQUEST_CHANGES" ;;
  esac

  FORMAL_BODY="AI ${REVIEW_TYPE}: **${VERDICT}** — ${VERDICT_REASON}"
  jq -n --arg body "$FORMAL_BODY" --arg event "$REVIEW_EVENT" --arg sha "$PR_SHA" \
    '{body: $body, event: $event, commit_id: $sha}' > "${TMPDIR}/formal-review.json"

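  # The payload written to formal-review.json ends up shaped like this
  # (values illustrative):
  #   {"body": "AI Review: **APPROVE** — …", "event": "APPROVED", "commit_id": "<PR head SHA>"}
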
  REVIEW_CODE=$(curl -s -o "${TMPDIR}/review-response.txt" -w "%{http_code}" \
    -X POST \
    -H "Authorization: token ${REVIEW_BOT_TOKEN}" \
    -H "Content-Type: application/json" \
    "${API_BASE}/pulls/${PR_NUMBER}/reviews" \
    --data-binary @"${TMPDIR}/formal-review.json")

  if [ "${REVIEW_CODE}" = "200" ]; then
    log "SUBMITTED formal ${REVIEW_EVENT} review"
  else
    log "WARNING: formal review failed (HTTP ${REVIEW_CODE}): $(head -c 200 "${TMPDIR}/review-response.txt" 2>/dev/null)"
    # Non-fatal — the comment is already posted
  fi
else
  log "ERROR: Codeberg HTTP ${POST_CODE}: $(head -c 200 "${TMPDIR}/post-response.txt" 2>/dev/null)"
  echo "$REVIEW_MD" > "${LOGDIR}/review-pr${PR_NUMBER}-${PR_SHA:0:7}.md"
  log "Review saved to ${LOGDIR}/review-pr${PR_NUMBER}-${PR_SHA:0:7}.md"
  exit 1
fi

# --- Auto-create follow-up issues from JSON ---
FOLLOWUP_COUNT=$(printf '%s' "$REVIEW_JSON" | jq '.followups | length')
if [ "$FOLLOWUP_COUNT" -gt 0 ]; then
  log "processing ${FOLLOWUP_COUNT} follow-up issues"

  TECH_DEBT_ID=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
    "${API_BASE}/labels" | jq -r '.[] | select(.name=="tech-debt") | .id')

  if [ -z "$TECH_DEBT_ID" ]; then
    TECH_DEBT_ID=$(curl -sf -X POST \
      -H "Authorization: token ${CODEBERG_TOKEN}" \
      -H "Content-Type: application/json" \
      "${API_BASE}/labels" \
      -d '{"name":"tech-debt","color":"#6B7280","description":"Pre-existing tech debt flagged by AI review"}' | jq -r '.id')
  fi

  CREATED_COUNT=0
  while IFS= read -r fu; do
    FU_TITLE=$(printf '%s' "$fu" | jq -r '.title')
    FU_DETAILS=$(printf '%s' "$fu" | jq -r '.details')

    # Check for duplicate
    EXISTING=$(curl -sf -H "Authorization: token ${CODEBERG_TOKEN}" \
      "${API_BASE}/issues?state=open&labels=tech-debt&limit=50" | \
      jq -r --arg t "$FU_TITLE" '[.[] | select(.title == $t)] | length')

    if [ "${EXISTING:-0}" -gt 0 ]; then
      log "skip duplicate follow-up: ${FU_TITLE}"
      continue
    fi

    ISSUE_BODY="Flagged by AI reviewer in PR #${PR_NUMBER}.

## Problem

${FU_DETAILS}

---
*Auto-created from AI review of PR #${PR_NUMBER}*"

    printf '%s' "$ISSUE_BODY" > "${TMPDIR}/followup-body.txt"
    jq -n \
      --arg title "$FU_TITLE" \
      --rawfile body "${TMPDIR}/followup-body.txt" \
      --argjson labels "[$TECH_DEBT_ID]" \
      '{title: $title, body: $body, labels: $labels}' > "${TMPDIR}/followup-issue.json"

    CREATED=$(curl -sf -X POST \
      -H "Authorization: token ${CODEBERG_TOKEN}" \
      -H "Content-Type: application/json" \
      "${API_BASE}/issues" \
      --data-binary @"${TMPDIR}/followup-issue.json" | jq -r '.number // empty')

    if [ -n "$CREATED" ]; then
      log "created follow-up issue #${CREATED}: ${FU_TITLE}"
      CREATED_COUNT=$((CREATED_COUNT + 1))
    fi
  done < <(printf '%s' "$REVIEW_JSON" | jq -c '.followups[]')

  log "created ${CREATED_COUNT} follow-up issues total"
fi

# --- Notify OpenClaw ---
openclaw system event \
  --text "🤖 PR #${PR_NUMBER} ${REVIEW_TYPE}: ${VERDICT} — ${PR_TITLE}" \
  --mode now 2>/dev/null || true

log "DONE: ${VERDICT} (${ELAPSED}s, re-review: ${IS_RE_REVIEW})"