OpenCode refactor and mission tracking fixes (#14)
* Fix missions staying Active after completion with OpenCode backend
  - Add TerminalReason::Completed variant for successful task completion
  - Set terminal_reason in OpenCodeAgent on success to trigger auto-complete
  - Update control.rs to explicitly handle Completed terminal reason
  - Update CLAUDE.md with OpenCode backend documentation
* Improve iOS dashboard UI polish
  - Remove harsh input field border, use ultraThinMaterial background with subtle focus glow
  - Clean up model selector pills: remove ugly truncated mission IDs, increase padding
  - Remove agent working indicator border for cleaner look
  - Increase input area bottom padding for better thumb reach
* Add real-time event streaming for OpenCode backend
  - Add SSE streaming support to OpenCodeClient via /event endpoint
  - Parse and forward OpenCode events (thinking, tool_call, tool_result)
  - Update OpenCodeAgent to consume stream and forward to control channel
  - Add fallback to blocking mode if SSE connection fails
  This enables live UI updates in the dashboard when using OpenCode backend.
* Fix running mission tracking to use actual executing mission ID
  Track the mission ID that the main `running` task is actually working on separately from `current_mission`, which can change when the user creates a new mission. This ensures ListRunning and GracefulShutdown correctly identify which mission is being executed.
* Add MCP server for desktop tools and Playwright integration
  - Create desktop-mcp binary that exposes i3/Xvfb desktop automation tools as an MCP server for use with OpenCode backend
  - Add opencode.json with both desktop and Playwright MCP configurations
  - Update deployment command to include desktop-mcp binary
  - Document available MCP tools in CLAUDE.md
  Desktop tools: start_session, stop_session, screenshot, type, click, mouse_move, scroll, i3_command, get_text
* Document SSH key and desktop-mcp binary in production section
  - Add ~/.ssh/cursor as the SSH key for production access
  - Add desktop-mcp binary location to production table
* Emphasize bun usage and add gitignore entries
  - Add clear instructions to ALWAYS use bun, never npm for dashboard
  - Gitignore .playwright-mcp/ directory (local MCP data)
  - Gitignore dashboard/package-lock.json (we use bun.lockb)
* Add mission delete and cleanup features to web and iOS dashboards
  Backend (Rust):
  - Add delete_mission() and delete_empty_untitled_missions() to supabase.rs
  - Add DELETE /api/control/missions/:id endpoint with running mission guard
  - Add POST /api/control/missions/cleanup endpoint for bulk cleanup
  Web Dashboard (Next.js):
  - Add deleteMission() and cleanupEmptyMissions() API functions
  - Add delete button (trash icon) on hover for each mission row
  - Add "Cleanup Empty" button with sparkles icon in filters area
  - Fix analytics to compute stats from missions/runs data instead of broken /api/stats
  iOS Dashboard (Swift):
  - Add deleteMission() and cleanupEmptyMissions() to APIService
  - Add delete() HTTP helper method
  - Add swipe-to-delete on mission rows (disabled for active missions)
  - Add "Cleanup" button with sparkles icon and progress indicator
  - Add success banner with auto-dismiss after cleanup
* Fix CancelMission and MCP notification parsing bugs
  - CancelMission now uses running_mission_id instead of current_mission to correctly identify the executing mission (fixes race condition when user creates a new mission while another is running)
  - MCP server JsonRpcRequest.id field now has #[serde(default)] to handle JSON-RPC 2.0 notifications, which don't have an id field
* Fix running mission tracking bugs
  - delete_mission: Query control actor for actual running missions instead of using always-empty running_missions list
  - cleanup_empty_missions: Exclude running missions from cleanup to prevent deleting missions mid-execution
  - get_parallel_config: Query control actor for accurate running count
  - Task completion: Save running_mission_id before clearing and use it for persist and auto-complete (fixes race when user creates a new mission while a task is running)
  All endpoints now use ControlCommand::ListRunning to get accurate running state from the control actor loop.
* Fix bugbot issues: analytics cost, browser cleanup, title truncation, history append
  - Add get_total_cost_cents() to supabase.rs for aggregating all run costs
  - Update /api/stats endpoint to return actual total cost from database
  - Fix analytics page to use stats endpoint for total cost (not limited to 100 runs)
  - Fix desktop_mcp.rs to save browser_pid to session file after launch
  - Fix mission title truncation to use safe_truncate_index and append "..."
  - Fix mission history to append to existing DB history instead of replacing (prevents data loss when CreateMission is called during task execution)
* Fix history context contamination and cumulative thinking content
  - Only push to local history if the completed mission matches the current mission, preventing old mission exchanges from contaminating new mission context
  - Accumulate thinking content across iterations so frontend replacement shows all thinking, matching OpenCode backend behavior
* Fix MCP notifications, orphaned processes, and shutdown persistence
  - MCP server no longer sends responses to JSON-RPC notifications (per spec)
  - Clean up Xvfb/i3/Chromium processes on partial session startup failure
  - Graceful shutdown only persists history if the running mission matches the current one
* Fix partial field selection deserialization in cleanup endpoint
  Use PartialMission struct for partial field queries to avoid deserialization failure when DbMission's required fields are missing.
* Clarify analytics success rate measures missions, not tasks
  Update labels to "Mission Success Rate" and "X missions completed" to make it clear the metric is mission-level, not task-level.
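The title-truncation fix above works because Rust strings are UTF-8 and slicing at an arbitrary byte index panics if it splits a multi-byte character. A minimal sketch of the idea (the actual `safe_truncate_index` helper in the repo may differ):

```rust
/// Largest index <= max_bytes that falls on a UTF-8 char boundary.
/// Walking back from max_bytes guarantees the slice below never
/// splits a multi-byte character (which would panic).
fn safe_truncate_index(s: &str, max_bytes: usize) -> usize {
    if s.len() <= max_bytes {
        return s.len();
    }
    let mut idx = max_bytes;
    while !s.is_char_boundary(idx) {
        idx -= 1;
    }
    idx
}

/// Hypothetical caller mirroring the commit: truncate and append "...".
fn truncate_title(title: &str, max_bytes: usize) -> String {
    let idx = safe_truncate_index(title, max_bytes);
    if idx == title.len() {
        title.to_string()
    } else {
        format!("{}...", &title[..idx])
    }
}

fn main() {
    assert_eq!(truncate_title("short", 10), "short");
    // "héllo" is 6 bytes; byte index 2 falls inside the 2-byte 'é',
    // so the cut walks back to index 1.
    assert_eq!(truncate_title("héllo", 2), "h...");
}
```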
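The MCP notification fixes hinge on one JSON-RPC 2.0 rule: a request carries an `id` and gets exactly one response, while a notification omits `id` and must get no response at all. A std-only sketch of that dispatch decision (types are simplified stand-ins; the real server models `id` with serde and `#[serde(default)]`):

```rust
// Simplified stand-in for the server's request type: a missing "id"
// member on the wire deserializes to None, marking a notification.
#[derive(Debug)]
struct JsonRpcRequest {
    id: Option<u64>,
    method: String,
}

/// Returns Some(response payload) for requests, None for notifications.
fn handle(req: &JsonRpcRequest) -> Option<String> {
    match req.id {
        Some(id) => Some(format!(
            r#"{{"jsonrpc":"2.0","id":{},"result":{{}}}}"#,
            id
        )),
        // Per the JSON-RPC 2.0 spec, a server MUST NOT reply to a notification.
        None => None,
    }
}

fn main() {
    let call = JsonRpcRequest { id: Some(1), method: "tools/list".into() };
    let note = JsonRpcRequest { id: None, method: "notifications/initialized".into() };
    assert!(handle(&call).is_some());
    assert!(handle(&note).is_none());
    let _ = (call.method, note.method);
}
```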
@@ -23,14 +23,19 @@ cargo test # Run tests
cargo fmt # Format code
cargo clippy # Lint

# Dashboard (uses Bun, NOT npm)
# Dashboard (uses Bun, NOT npm/yarn/pnpm)
cd dashboard
bun install # Install deps
bun install # Install deps (NEVER use npm install)
bun dev # Dev server (port 3001)
bun run build # Production build

# IMPORTANT: Always use bun for dashboard, never npm
# - bun install (not npm install)
# - bun add <pkg> (not npm install <pkg>)
# - bun run <script> (not npm run <script>)

# Deployment
ssh root@95.216.112.253 'cd /root/open_agent && git pull && cargo build --release && cp target/release/open_agent /usr/local/bin/ && systemctl restart open_agent'
ssh root@95.216.112.253 'cd /root/open_agent && git pull && cargo build --release && cp target/release/open_agent /usr/local/bin/ && cp target/release/desktop-mcp /usr/local/bin/ && systemctl restart open_agent'
```

## Architecture

@@ -87,6 +92,35 @@ OPENCODE_PERMISSIVE=true
Dashboard → Open Agent API → OpenCode Server → Anthropic API (Claude Max)
```

**Desktop Tools with OpenCode:**
To enable desktop tools (i3, Xvfb, screenshots) when using the OpenCode backend:

1. Build the MCP server: `cargo build --release --bin desktop-mcp`
2. Ensure `opencode.json` is in the project root with the desktop MCP config
3. OpenCode will automatically load the tools from the MCP server

The `opencode.json` configures MCP servers for desktop and browser automation:
```json
{
  "mcp": {
    "desktop": {
      "type": "local",
      "command": ["./target/release/desktop-mcp"],
      "enabled": true
    },
    "playwright": {
      "type": "local",
      "command": ["npx", "@playwright/mcp@latest"],
      "enabled": true
    }
  }
}
```

**Available MCP Tools:**
- **Desktop tools** (i3/Xvfb): `desktop_start_session`, `desktop_screenshot`, `desktop_click`, `desktop_type`, `desktop_i3_command`, etc.
- **Playwright tools**: `browser_navigate`, `browser_snapshot`, `browser_click`, `browser_type`, `browser_screenshot`, etc.

## Model Preferences

**With OpenCode backend:** Use Claude models via your Claude Max subscription.

@@ -205,13 +239,16 @@ pub fn do_thing() -> Result<T, MyError> {
| Property | Value |
|----------|-------|
| Host | `95.216.112.253` |
| SSH | `ssh root@95.216.112.253` |
| SSH | `ssh -i ~/.ssh/cursor root@95.216.112.253` |
| Backend URL | `https://agent-backend.thomas.md` |
| Dashboard URL | `https://agent.thomas.md` |
| Binary | `/usr/local/bin/open_agent` |
| Desktop MCP | `/usr/local/bin/desktop-mcp` |
| Env file | `/etc/open_agent/open_agent.env` |
| Service | `systemctl status open_agent` |

**SSH Key:** Use `~/.ssh/cursor` key for production server access.

## Adding New Components

### New API Endpoint
.gitignore (vendored): 6 changes
@@ -34,3 +34,9 @@ lmarena_leaderboard.json
# OS files
.DS_Store
Thumbs.db

# Playwright MCP local data
.playwright-mcp/

# npm lockfile (we use bun)
dashboard/package-lock.json
@@ -55,5 +55,13 @@ portable-pty = "0.9"
tokio-util = { version = "0.7", features = ["io"] }
chromiumoxide = { version = "0.8.0", features = ["tokio-runtime"] }

[[bin]]
name = "open_agent"
path = "src/main.rs"

[[bin]]
name = "desktop-mcp"
path = "src/bin/desktop_mcp.rs"

[dev-dependencies]
tokio-test = "0.4"
@@ -3,10 +3,9 @@
import { useEffect, useState, useMemo } from "react";
import { toast } from "sonner";
import {
getStats,
listMissions,
listRuns,
type StatsResponse,
getStats,
type Mission,
type Run,
} from "@/lib/api";
@@ -37,23 +36,23 @@ interface StatusBreakdown {
}

export default function AnalyticsPage() {
const [stats, setStats] = useState<StatsResponse | null>(null);
const [missions, setMissions] = useState<Mission[]>([]);
const [runs, setRuns] = useState<Run[]>([]);
const [totalCostCents, setTotalCostCents] = useState(0);
const [loading, setLoading] = useState(true);
const [timeRange, setTimeRange] = useState<"7d" | "30d" | "all">("7d");

useEffect(() => {
async function fetchData() {
try {
const [statsData, missionsData, runsData] = await Promise.all([
getStats(),
const [missionsData, runsData, statsData] = await Promise.all([
listMissions(),
listRuns(100, 0),
getStats(),
]);
setStats(statsData);
setMissions(missionsData);
setRuns(runsData.runs);
setTotalCostCents(statsData.total_cost_cents);
} catch (err) {
console.error("Failed to fetch analytics:", err);
toast.error("Failed to load analytics");
@@ -126,6 +125,16 @@ export default function AnalyticsPage() {
return totalCost / runs.length;
}, [runs]);

// Calculate mission stats from actual mission data
const missionStats = useMemo(() => {
const completed = missions.filter(m => m.status === "completed").length;
const failed = missions.filter(m => m.status === "failed" || m.status === "not_feasible").length;
const finished = completed + failed;
const successRate = finished > 0 ? completed / finished : 1;
// Use totalCostCents from stats API (includes ALL runs, not just first 100)
return { completed, failed, successRate, totalCost: totalCostCents };
}, [missions, totalCostCents]);

// Calculate max single day cost
const maxDayCost = useMemo(() => {
return Math.max(...costByDay.map((d) => d.cost), 1);
@@ -193,7 +202,7 @@ export default function AnalyticsPage() {
<span className="text-xs text-white/50">Total Spent</span>
</div>
<div className="text-2xl font-semibold text-white">
{formatCents(stats?.total_cost_cents ?? 0)}
{formatCents(missionStats.totalCost)}
</div>
<div className="text-xs text-white/40 mt-1">
{formatCents(periodTotalCost)} in selected period
@@ -229,13 +238,13 @@ export default function AnalyticsPage() {
<div className="bg-white/[0.02] border border-white/[0.06] rounded-xl p-4">
<div className="flex items-center gap-2 mb-2">
<CheckCircle className="h-4 w-4 text-emerald-400" />
<span className="text-xs text-white/50">Success Rate</span>
<span className="text-xs text-white/50">Mission Success Rate</span>
</div>
<div className="text-2xl font-semibold text-white">
{((stats?.success_rate ?? 1) * 100).toFixed(0)}%
{(missionStats.successRate * 100).toFixed(0)}%
</div>
<div className="text-xs text-white/40 mt-1">
{stats?.completed_tasks ?? 0} completed, {stats?.failed_tasks ?? 0} failed
{missionStats.completed} missions completed, {missionStats.failed} failed
</div>
</div>
</div>
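The mission-success-rate computation in the analytics hunk above is small but easy to get wrong (both `failed` and `not_feasible` count as failures; with zero finished missions the rate defaults to 1 instead of dividing by zero). A hypothetical Rust port of that TS logic, for illustration only:

```rust
// Hypothetical mirror of the dashboard's missionStats useMemo:
// active missions are excluded, and an empty denominator yields 1.0.
struct Mission {
    status: &'static str,
}

fn mission_success_rate(missions: &[Mission]) -> (usize, usize, f64) {
    let completed = missions.iter().filter(|m| m.status == "completed").count();
    let failed = missions
        .iter()
        .filter(|m| m.status == "failed" || m.status == "not_feasible")
        .count();
    let finished = completed + failed;
    let rate = if finished > 0 {
        completed as f64 / finished as f64
    } else {
        1.0 // no finished missions yet: report 100%, not NaN
    };
    (completed, failed, rate)
}

fn main() {
    let ms = [
        Mission { status: "completed" },
        Mission { status: "completed" },
        Mission { status: "not_feasible" },
        Mission { status: "active" }, // ignored by the metric
    ];
    let (c, f, r) = mission_success_rate(&ms);
    assert_eq!((c, f), (2, 1));
    assert!((r - 2.0 / 3.0).abs() < 1e-9);
    assert_eq!(mission_success_rate(&[]).2, 1.0);
}
```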
@@ -1192,7 +1192,8 @@ export default function ControlClient() {
>;
updated[existingIdx] = {
...existing,
content: existing.content + "\n\n---\n\n" + content,
// Replace content instead of appending - backend sends cumulative content
content,
done,
};
return updated;
@@ -4,7 +4,7 @@ import { useEffect, useState, useRef, useMemo, useCallback } from "react";
import Link from "next/link";
import { toast } from "sonner";
import { cn } from "@/lib/utils";
import { listMissions, getMissionTree, Mission } from "@/lib/api";
import { listMissions, getMissionTree, deleteMission, cleanupEmptyMissions, Mission } from "@/lib/api";
import { ShimmerTableRow } from "@/components/ui/shimmer";
import { CopyButton } from "@/components/ui/copy-button";
import { RelativeTime } from "@/components/ui/relative-time";
@@ -24,6 +24,8 @@ import {
ArrowDown,
Network,
X,
Trash2,
Sparkles,
} from "lucide-react";

const statusIcons: Record<string, typeof Clock> = {
@@ -117,10 +119,14 @@ export default function HistoryPage() {
const [previewMissionId, setPreviewMissionId] = useState<string | null>(null);
const [previewTree, setPreviewTree] = useState<AgentNode | null>(null);
const [loadingTree, setLoadingTree] = useState(false);

// Track the mission ID being fetched to prevent race conditions
const fetchingTreeMissionIdRef = useRef<string | null>(null);

// Delete state
const [deletingMissionId, setDeletingMissionId] = useState<string | null>(null);
const [cleaningUp, setCleaningUp] = useState(false);

useEffect(() => {
if (fetchedRef.current) return;
fetchedRef.current = true;
@@ -193,6 +199,49 @@ export default function HistoryPage() {
}
};

const handleDeleteMission = useCallback(async (missionId: string, e: React.MouseEvent) => {
e.preventDefault();
e.stopPropagation();

const mission = missions.find(m => m.id === missionId);
if (mission?.status === "active") {
toast.error("Cannot delete an active mission");
return;
}

setDeletingMissionId(missionId);
try {
await deleteMission(missionId);
setMissions(prev => prev.filter(m => m.id !== missionId));
toast.success("Mission deleted");
} catch (error) {
console.error("Failed to delete mission:", error);
toast.error("Failed to delete mission");
} finally {
setDeletingMissionId(null);
}
}, [missions]);

const handleCleanupEmpty = useCallback(async () => {
setCleaningUp(true);
try {
const result = await cleanupEmptyMissions();
if (result.deleted_count > 0) {
// Refresh the missions list
const missionsData = await listMissions().catch(() => []);
setMissions(missionsData);
toast.success(`Cleaned up ${result.deleted_count} empty mission${result.deleted_count === 1 ? '' : 's'}`);
} else {
toast.info("No empty missions to clean up");
}
} catch (error) {
console.error("Failed to cleanup missions:", error);
toast.error("Failed to cleanup missions");
} finally {
setCleaningUp(false);
}
}, []);

const filteredMissions = useMemo(() => {
const filtered = missions.filter((mission) => {
if (filter !== "all" && mission.status !== filter) return false;
@@ -262,6 +311,24 @@ export default function HistoryPage() {
</button>
))}
</div>

<button
onClick={handleCleanupEmpty}
disabled={cleaningUp}
className={cn(
"inline-flex items-center gap-2 px-3 py-2 rounded-lg text-xs font-medium transition-colors",
"bg-white/[0.02] border border-white/[0.04] hover:bg-white/[0.04]",
"text-white/60 hover:text-white/80",
cleaningUp && "opacity-50 cursor-not-allowed"
)}
>
{cleaningUp ? (
<Loader className="h-3.5 w-3.5 animate-spin" />
) : (
<Sparkles className="h-3.5 w-3.5" />
)}
Cleanup Empty
</button>
</div>

{/* Content */}
@@ -419,6 +486,25 @@ export default function HistoryPage() {
>
<Network className="h-3 w-3" />
</button>
<button
onClick={(e) => handleDeleteMission(mission.id, e)}
disabled={deletingMissionId === mission.id || mission.status === "active"}
className={cn(
"inline-flex items-center gap-1 text-xs transition-colors opacity-0 group-hover:opacity-100",
deletingMissionId === mission.id
? "text-white/30 cursor-not-allowed"
: mission.status === "active"
? "text-white/20 cursor-not-allowed"
: "text-white/40 hover:text-red-400"
)}
title={mission.status === "active" ? "Cannot delete active mission" : "Delete mission"}
>
{deletingMissionId === mission.id ? (
<Loader className="h-3 w-3 animate-spin" />
) : (
<Trash2 className="h-3 w-3" />
)}
</button>
<CopyButton
text={mission.id}
showOnHover
@@ -396,6 +396,30 @@ export async function setMissionStatus(
if (!res.ok) throw new Error("Failed to set mission status");
}

// Delete a mission
export async function deleteMission(id: string): Promise<{ ok: boolean; deleted: string }> {
const res = await apiFetch(`/api/control/missions/${id}`, {
method: "DELETE",
});
if (!res.ok) {
const text = await res.text();
throw new Error(`Failed to delete mission: ${text}`);
}
return res.json();
}

// Cleanup empty untitled missions
export async function cleanupEmptyMissions(): Promise<{ ok: boolean; deleted_count: number }> {
const res = await apiFetch("/api/control/missions/cleanup", {
method: "POST",
});
if (!res.ok) {
const text = await res.text();
throw new Error(`Failed to cleanup missions: ${text}`);
}
return res.json();
}

// Resume an interrupted mission
export async function resumeMission(id: string, cleanWorkspace: boolean = false): Promise<Mission> {
const res = await apiFetch(`/api/control/missions/${id}/resume`, {
@@ -105,6 +105,29 @@ final class APIService {
func cancelMission(id: String) async throws {
let _: EmptyResponse = try await post("/api/control/missions/\(id)/cancel", body: EmptyBody())
}

func deleteMission(id: String) async throws -> Bool {
struct DeleteResponse: Decodable {
let ok: Bool
let deleted: String
}
let response: DeleteResponse = try await delete("/api/control/missions/\(id)")
return response.ok
}

func cleanupEmptyMissions() async throws -> Int {
struct CleanupResponse: Decodable {
let ok: Bool
let deletedCount: Int

enum CodingKeys: String, CodingKey {
case ok
case deletedCount = "deleted_count"
}
}
let response: CleanupResponse = try await post("/api/control/missions/cleanup", body: EmptyBody())
return response.deletedCount
}

// MARK: - Parallel Missions

@@ -324,17 +347,32 @@ final class APIService {
guard let url = URL(string: "\(baseURL)\(path)") else {
throw APIError.invalidURL
}

var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")

if authenticated, let token = jwtToken {
request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
}

request.httpBody = try JSONEncoder().encode(body)

return try await execute(request)
}

private func delete<T: Decodable>(_ path: String, authenticated: Bool = true) async throws -> T {
guard let url = URL(string: "\(baseURL)\(path)") else {
throw APIError.invalidURL
}

var request = URLRequest(url: url)
request.httpMethod = "DELETE"

if authenticated, let token = jwtToken {
request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
}

return try await execute(request)
}
@@ -15,7 +15,9 @@ struct HistoryView: View {
@State private var searchText = ""
@State private var selectedFilter: StatusFilter = .all
@State private var errorMessage: String?

@State private var isCleaningUp = false
@State private var showCleanupResult: String?

private let api = APIService.shared
private let nav = NavigationState.shared

@@ -81,24 +83,79 @@ struct HistoryView: View {
.stroke(Theme.border, lineWidth: 1)
)

// Filter pills
ScrollView(.horizontal, showsIndicators: false) {
HStack(spacing: 8) {
ForEach(StatusFilter.allCases, id: \.rawValue) { filter in
FilterPill(
title: filter.rawValue,
isSelected: selectedFilter == filter
) {
withAnimation(.easeInOut(duration: 0.2)) {
selectedFilter = filter
// Filter pills and cleanup button
HStack(spacing: 12) {
ScrollView(.horizontal, showsIndicators: false) {
HStack(spacing: 8) {
ForEach(StatusFilter.allCases, id: \.rawValue) { filter in
FilterPill(
title: filter.rawValue,
isSelected: selectedFilter == filter
) {
withAnimation(.easeInOut(duration: 0.2)) {
selectedFilter = filter
}
HapticService.selectionChanged()
}
HapticService.selectionChanged()
}
}
}

// Cleanup button
Button {
Task { await cleanupEmptyMissions() }
} label: {
HStack(spacing: 6) {
if isCleaningUp {
ProgressView()
.scaleEffect(0.7)
.tint(Theme.textSecondary)
} else {
Image(systemName: "sparkles")
.font(.caption)
}
Text("Cleanup")
.font(.caption.weight(.medium))
}
.foregroundStyle(Theme.textSecondary)
.padding(.horizontal, 12)
.padding(.vertical, 8)
.background(.ultraThinMaterial)
.clipShape(Capsule())
.overlay(
Capsule()
.stroke(Theme.border, lineWidth: 0.5)
)
}
.disabled(isCleaningUp)
.opacity(isCleaningUp ? 0.6 : 1)
}
}
.padding()

// Cleanup result banner
if let result = showCleanupResult {
HStack {
Image(systemName: "checkmark.circle.fill")
.foregroundStyle(Theme.success)
Text(result)
.font(.subheadline)
.foregroundStyle(Theme.textPrimary)
Spacer()
Button {
withAnimation { showCleanupResult = nil }
} label: {
Image(systemName: "xmark")
.font(.caption)
.foregroundStyle(Theme.textTertiary)
}
}
.padding()
.background(Theme.success.opacity(0.1))
.clipShape(RoundedRectangle(cornerRadius: 10))
.padding(.horizontal)
.transition(.move(edge: .top).combined(with: .opacity))
}

// Content
if isLoading {
@@ -145,6 +202,15 @@ struct HistoryView: View {
MissionRow(mission: mission)
}
.buttonStyle(.plain)
.swipeActions(edge: .trailing, allowsFullSwipe: false) {
if mission.status != .active {
Button(role: .destructive) {
Task { await deleteMission(mission) }
} label: {
Label("Delete", systemImage: "trash")
}
}
}
}
} header: {
SectionHeader(
@@ -189,23 +255,70 @@ struct HistoryView: View {
private func loadData() async {
isLoading = true
errorMessage = nil

do {
async let missionsTask = api.listMissions()
async let tasksTask = api.listTasks()
async let runsTask = api.listRuns()

let (missionsResult, tasksResult, runsResult) = try await (missionsTask, tasksTask, runsTask)

missions = missionsResult
tasks = tasksResult
runs = runsResult
} catch {
errorMessage = error.localizedDescription
}

isLoading = false
}

private func deleteMission(_ mission: Mission) async {
do {
_ = try await api.deleteMission(id: mission.id)
withAnimation {
missions.removeAll { $0.id == mission.id }
}
HapticService.success()
} catch {
HapticService.error()
errorMessage = "Failed to delete mission: \(error.localizedDescription)"
}
}

private func cleanupEmptyMissions() async {
isCleaningUp = true

do {
let count = try await api.cleanupEmptyMissions()
if count > 0 {
// Refresh the list
let newMissions = try await api.listMissions()
withAnimation {
missions = newMissions
showCleanupResult = "Cleaned up \(count) empty mission\(count == 1 ? "" : "s")"
}
HapticService.success()
} else {
withAnimation {
showCleanupResult = "No empty missions to clean up"
}
}

// Auto-hide the result after 3 seconds
Task {
try? await Task.sleep(for: .seconds(3))
withAnimation {
showCleanupResult = nil
}
}
} catch {
HapticService.error()
errorMessage = "Cleanup failed: \(error.localizedDescription)"
}

isCleaningUp = false
}
}

// MARK: - Supporting Views
opencode.json (new file): 18 lines
@@ -0,0 +1,18 @@
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "desktop": {
      "type": "local",
      "command": ["./target/release/desktop-mcp"],
      "enabled": true,
      "environment": {
        "DESKTOP_RESOLUTION": "1920x1080"
      }
    },
    "playwright": {
      "type": "local",
      "command": ["npx", "@playwright/mcp@latest"],
      "enabled": true
    }
  }
}
@@ -688,6 +688,10 @@ If you cannot perform the requested analysis, use `complete_mission(blocked, rea
// Track consecutive empty/reasoning-only responses (P0 fix for agent stalls)
let mut empty_response_count: u32 = 0;

// Cumulative thinking content - we append each iteration's thinking so the frontend
// can replace (not append) and still see all thinking. This matches OpenCode behavior.
let mut cumulative_thinking = String::new();

// Track failed tool attempts by category (P3 fix for approach looping)
let mut failure_tracker = ToolFailureTracker::new();

@@ -929,11 +933,19 @@ If you cannot perform the requested analysis, use `complete_mission(blocked, rea
};

// Emit thinking event if there's content (agent reasoning)
// We accumulate thinking content across iterations and send cumulative content,
// so the frontend can replace (not append) and still see all thinking.
if let Some(ref content) = response.content {
if !content.is_empty() {
// Append to cumulative with separator if not first
if !cumulative_thinking.is_empty() {
cumulative_thinking.push_str("\n\n---\n\n");
}
cumulative_thinking.push_str(content);

if let Some(events) = &ctx.control_events {
let _ = events.send(AgentEvent::Thinking {
content: content.clone(),
content: cumulative_thinking.clone(),
done: response.tool_calls.is_none(),
mission_id: ctx.mission_id,
});
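The accumulation step in the hunk above is the whole contract between backend and frontend: each iteration's thinking is appended with a `---` separator and the full buffer is re-sent, so the UI can blindly replace what it has. Isolated as a standalone sketch:

```rust
// Sketch of the cumulative-thinking behavior: no separator before the
// first chunk, "\n\n---\n\n" between subsequent chunks.
fn append_thinking(cumulative: &mut String, content: &str) {
    if !cumulative.is_empty() {
        cumulative.push_str("\n\n---\n\n");
    }
    cumulative.push_str(content);
}

fn main() {
    let mut buf = String::new();
    append_thinking(&mut buf, "first pass");
    assert_eq!(buf, "first pass"); // no leading separator
    append_thinking(&mut buf, "second pass");
    assert_eq!(buf, "first pass\n\n---\n\nsecond pass");
}
```

Sending the cumulative buffer (rather than deltas) trades bandwidth for simplicity: a replace-only frontend can never double-append or miss a chunk after a reconnect.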
@@ -1006,11 +1006,144 @@ pub async fn resume_mission(
|
||||
/// Get parallel execution configuration.
|
||||
pub async fn get_parallel_config(
|
||||
State(state): State<Arc<AppState>>,
|
||||
) -> Json<serde_json::Value> {
|
||||
Json(serde_json::json!({
|
||||
) -> Result<Json<serde_json::Value>, (StatusCode, String)> {
|
||||
// Query actual running count from the control actor
|
||||
// (the running state is tracked in the actor loop, not in shared state)
|
||||
let (tx, rx) = oneshot::channel();
|
||||
state
|
||||
.control
|
||||
.cmd_tx
|
||||
.send(ControlCommand::ListRunning { respond: tx })
|
||||
.await
|
||||
.map_err(|_| {
|
||||
(
|
||||
StatusCode::SERVICE_UNAVAILABLE,
|
||||
"control session unavailable".to_string(),
|
||||
)
|
||||
})?;
|
||||
|
||||
let running = rx.await.map_err(|_| {
|
||||
(
|
||||
StatusCode::INTERNAL_SERVER_ERROR,
|
||||
"Failed to get running missions".to_string(),
|
||||
)
|
||||
})?;
|
||||
|
||||
Ok(Json(serde_json::json!({
|
||||
"max_parallel_missions": state.control.max_parallel,
|
||||
"running_count": state.control.running_missions.read().await.len(),
|
||||
}))
|
||||
"running_count": running.len(),
|
||||
})))
|
||||
}
|
||||
|
||||
+/// Delete a mission by ID.
+/// Only allows deleting missions that are not currently running.
+pub async fn delete_mission(
+    State(state): State<Arc<AppState>>,
+    Path(mission_id): Path<Uuid>,
+) -> Result<Json<serde_json::Value>, (StatusCode, String)> {
+    // Check if mission is currently running by querying the control actor
+    // (the actual running state is tracked in the actor loop, not in shared state)
+    let (tx, rx) = oneshot::channel();
+    state
+        .control
+        .cmd_tx
+        .send(ControlCommand::ListRunning { respond: tx })
+        .await
+        .map_err(|_| {
+            (
+                StatusCode::SERVICE_UNAVAILABLE,
+                "control session unavailable".to_string(),
+            )
+        })?;
+
+    let running = rx.await.map_err(|_| {
+        (
+            StatusCode::INTERNAL_SERVER_ERROR,
+            "Failed to check running missions".to_string(),
+        )
+    })?;
+
+    if running.iter().any(|m| m.mission_id == mission_id) {
+        return Err((
+            StatusCode::CONFLICT,
+            "Cannot delete a running mission. Cancel it first.".to_string(),
+        ));
+    }
+
+    // Get memory system
+    let mem = state.memory.as_ref().ok_or_else(|| {
+        (
+            StatusCode::SERVICE_UNAVAILABLE,
+            "Memory system not available".to_string(),
+        )
+    })?;
+
+    // Delete the mission
+    let deleted = mem
+        .supabase
+        .delete_mission(mission_id)
+        .await
+        .map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()))?;
+
+    if deleted {
+        Ok(Json(serde_json::json!({
+            "ok": true,
+            "deleted": mission_id
+        })))
+    } else {
+        Err((StatusCode::NOT_FOUND, "Mission not found".to_string()))
+    }
+}

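The running-mission guard in `delete_mission` boils down to a membership check before touching the database. A minimal sketch with a hypothetical `check_deletable` helper and plain integer IDs standing in for `Uuid`:

```rust
// Hypothetical mirror of the delete guard: refuse to delete a mission the
// control actor reports as currently running (HTTP 409 in the real handler).
fn check_deletable(mission_id: u64, running_ids: &[u64]) -> Result<(), (u16, String)> {
    if running_ids.contains(&mission_id) {
        return Err((
            409,
            "Cannot delete a running mission. Cancel it first.".to_string(),
        ));
    }
    Ok(())
}
```

Running the check before the DELETE means a mission can never vanish out from under its executing task.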
+/// Delete all empty "Untitled" missions.
+/// Returns the count of deleted missions.
+/// Note: This excludes any currently running missions to prevent data loss.
+pub async fn cleanup_empty_missions(
+    State(state): State<Arc<AppState>>,
+) -> Result<Json<serde_json::Value>, (StatusCode, String)> {
+    // Get currently running mission IDs to exclude from cleanup
+    // (a newly-started mission may have empty history in DB while actively running)
+    let (tx, rx) = oneshot::channel();
+    state
+        .control
+        .cmd_tx
+        .send(ControlCommand::ListRunning { respond: tx })
+        .await
+        .map_err(|_| {
+            (
+                StatusCode::SERVICE_UNAVAILABLE,
+                "control session unavailable".to_string(),
+            )
+        })?;
+
+    let running = rx.await.map_err(|_| {
+        (
+            StatusCode::INTERNAL_SERVER_ERROR,
+            "Failed to check running missions".to_string(),
+        )
+    })?;
+
+    let running_ids: Vec<Uuid> = running.iter().map(|m| m.mission_id).collect();
+
+    // Get memory system
+    let mem = state.memory.as_ref().ok_or_else(|| {
+        (
+            StatusCode::SERVICE_UNAVAILABLE,
+            "Memory system not available".to_string(),
+        )
+    })?;
+
+    // Delete empty untitled missions, excluding running ones
+    let count = mem
+        .supabase
+        .delete_empty_untitled_missions_excluding(&running_ids)
+        .await
+        .map_err(|e| (StatusCode::INTERNAL_SERVER_ERROR, e.to_string()))?;
+
+    Ok(Json(serde_json::json!({
+        "ok": true,
+        "deleted_count": count
+    })))
+}

 /// Stream control session events via SSE.
@@ -1718,16 +1851,17 @@ async fn control_actor_loop(
                 // First check parallel runners
                 if let Some(runner) = parallel_runners.get_mut(&mission_id) {
                     runner.cancel();
                     let _ = events_tx.send(AgentEvent::Error {
                         message: format!("Parallel mission {} cancelled", mission_id),
                         mission_id: Some(mission_id),
                     });
                     parallel_runners.remove(&mission_id);
                     let _ = respond.send(Ok(()));
                 } else {
-                    // Check if this is the current running mission
-                    let current = current_mission.read().await.clone();
-                    if current == Some(mission_id) {
+                    // Check if this is the currently executing mission
+                    // Use running_mission_id (the actual mission being executed)
+                    // instead of current_mission (which can change when user creates a new mission)
+                    if running_mission_id == Some(mission_id) {
                         // Cancel the current execution
                         if let Some(token) = &running_cancel {
                             token.cancel();
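Why comparing against `running_mission_id` fixes the cancel race can be shown in miniature. This is a hypothetical function with integer IDs standing in for `Uuid`, not the PR's actual code:

```rust
// `current_mission` moves to the new mission as soon as the user creates one,
// but the executing task is still the old mission, tracked separately as
// `running_mission_id`. Cancellation must target the latter.
fn should_cancel_running(
    target: u64,
    current_mission: Option<u64>,
    running_mission_id: Option<u64>,
) -> bool {
    // The buggy version compared `target` against `current_mission`; after
    // "New Mission" that wrongly declined to cancel the task actually running.
    let _ = current_mission;
    running_mission_id == Some(target)
}
```

With `current_mission = Some(2)` (the fresh mission) and `running_mission_id = Some(1)` (the executing one), cancelling mission 1 now succeeds.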
@@ -1862,18 +1996,30 @@ async fn control_actor_loop(
                 let mut interrupted_ids = Vec::new();

+                // Handle main mission - use running_mission_id (the actual mission being executed)
+                // Note: We DON'T persist history here because:
+                // 1. If current_mission == running_mission_id, history is correct
+                // 2. If current_mission != running_mission_id (user created new mission),
+                //    history was cleared and doesn't belong to running_mission_id
+                // The running mission's history is already in DB from previous exchanges,
+                // and any in-progress exchange will be lost (acceptable for shutdown).
-                if running.is_some() {
-                    // Persist current history before marking as interrupted
-                    persist_mission_history(&memory, &current_mission, &history).await;
+                if let Some(mission_id) = running_mission_id {
+                    // Only persist if the running mission is still the current mission
+                    // (i.e., user didn't create a new mission while this one was running)
+                    let current_mid = current_mission.read().await.clone();
+                    if current_mid == Some(mission_id) {
+                        persist_mission_history(&memory, &current_mission, &history).await;
+                    }
+                    // Note: If missions differ, don't persist - the local history
+                    // belongs to current_mission, not running_mission_id

                     if let Some(mem) = &memory {
                         if let Ok(()) = mem.supabase.update_mission_status(mission_id, "interrupted").await {
                             interrupted_ids.push(mission_id);
                             tracing::info!("Marked mission {} as interrupted", mission_id);
                         }
                     }

                     // Cancel execution
                     if let Some(token) = &running_cancel {
                         token.cancel();
@@ -1981,17 +2127,67 @@ async fn control_actor_loop(
                     }
                 }, if running.is_some() => {
                     if let Some(res) = finished {
+                        // Save the running mission ID before clearing it - we need it for persist and auto-complete
+                        // (current_mission can change if user clicks "New Mission" while task was running)
+                        let completed_mission_id = running_mission_id;
                         running = None;
                         running_cancel = None;
+                        running_mission_id = None;
                         match res {
                             Ok((_mid, user_msg, agent_result)) => {
-                                // Append to conversation history.
-                                history.push(("user".to_string(), user_msg));
-                                history.push(("assistant".to_string(), agent_result.output.clone()));
+                                // Only append to local history if this mission is still the current mission.
+                                // If the user created a new mission mid-execution, history was cleared for that new mission,
+                                // and we don't want to contaminate it with the old mission's exchange.
+                                let current_mid = current_mission.read().await.clone();
+                                if completed_mission_id == current_mid {
+                                    history.push(("user".to_string(), user_msg.clone()));
+                                    history.push(("assistant".to_string(), agent_result.output.clone()));
+                                }

-                                // Persist to mission
-                                persist_mission_history(&memory, &current_mission, &history).await;
+                                // Persist to mission using the actual completed mission ID
+                                // (not current_mission, which could have changed)
+                                //
+                                // IMPORTANT: We fetch existing history from DB and append, rather than
+                                // using the local `history` variable, because CreateMission may have
+                                // cleared `history` while this task was running. This prevents data loss.
+                                if let (Some(mem), Some(mid)) = (&memory, completed_mission_id) {
+                                    // Fetch existing history from DB
+                                    let existing_history: Vec<MissionHistoryEntry> = match mem.supabase.get_mission(mid).await {
+                                        Ok(Some(mission)) => {
+                                            serde_json::from_value(mission.history).unwrap_or_default()
+                                        }
+                                        _ => Vec::new(),
+                                    };

+                                    // Append new messages to existing history
+                                    let mut messages: Vec<MissionMessage> = existing_history
+                                        .iter()
+                                        .map(|e| MissionMessage {
+                                            role: e.role.clone(),
+                                            content: e.content.clone(),
+                                        })
+                                        .collect();
+                                    messages.push(MissionMessage { role: "user".to_string(), content: user_msg.clone() });
+                                    messages.push(MissionMessage { role: "assistant".to_string(), content: agent_result.output.clone() });

+                                    if let Err(e) = mem.supabase.update_mission_history(mid, &messages).await {
+                                        tracing::warn!("Failed to persist mission history: {}", e);
+                                    }

+                                    // Update title from first user message if not set (only on first exchange)
+                                    if existing_history.is_empty() {
+                                        // Use safe_truncate_index for UTF-8 safe truncation, matching persist_mission_history
+                                        let title = if user_msg.len() > 100 {
+                                            let safe_end = crate::memory::safe_truncate_index(&user_msg, 100);
+                                            format!("{}...", &user_msg[..safe_end])
+                                        } else {
+                                            user_msg.clone()
+                                        };
+                                        if let Err(e) = mem.supabase.update_mission_title(mid, &title).await {
+                                            tracing::warn!("Failed to update mission title: {}", e);
+                                        }
+                                    }
+                                }

                                 // P1 FIX: Auto-complete mission if agent execution ended in a terminal state
                                 // without an explicit complete_mission call.
@@ -2004,7 +2200,9 @@ async fn control_actor_loop(
                                 // - Parallel missions (each has its own DB status)
                                 if agent_result.terminal_reason.is_some() {
                                     if let Some(mem) = &memory {
-                                        if let Some(mission_id) = current_mission.read().await.clone() {
+                                        // Use completed_mission_id (the actual mission that just finished)
+                                        // instead of current_mission (which can change when user creates a new mission)
+                                        if let Some(mission_id) = completed_mission_id {
                                             // Check current mission status from DB - only auto-complete if still "active"
                                             let current_status = mem.supabase.get_mission(mission_id).await
                                                 .ok()
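The title-truncation path relies on `safe_truncate_index` never cutting inside a multi-byte character, since slicing a `&str` off a char boundary panics. A std-only sketch of what such a helper plausibly does (the actual `crate::memory` implementation may differ), together with the title logic from the diff:

```rust
// Largest byte index <= max that falls on a char boundary, so slicing with
// s[..idx] never panics mid-codepoint.
fn safe_truncate_index(s: &str, max: usize) -> usize {
    if s.len() <= max {
        return s.len();
    }
    let mut end = max;
    while !s.is_char_boundary(end) {
        end -= 1;
    }
    end
}

// Mirror of the diff's title derivation from the first user message.
fn title_from(user_msg: &str) -> String {
    if user_msg.len() > 100 {
        format!("{}...", &user_msg[..safe_truncate_index(user_msg, 100)])
    } else {
        user_msg.to_string()
    }
}
```

With a message of forty 3-byte characters (120 bytes), byte 100 is mid-character, so the helper backs off to 99 instead of panicking.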
@@ -133,6 +133,9 @@ pub async fn serve(config: Config) -> anyhow::Result<()> {
         .route("/api/control/missions/:id/cancel", post(control::cancel_mission))
         .route("/api/control/missions/:id/resume", post(control::resume_mission))
         .route("/api/control/missions/:id/parallel", post(control::start_mission_parallel))
+        .route("/api/control/missions/:id", axum::routing::delete(control::delete_mission))
+        // Mission cleanup
+        .route("/api/control/missions/cleanup", post(control::cleanup_empty_missions))
         // Parallel execution endpoints
         .route("/api/control/running", get(control::list_running_missions))
         .route("/api/control/parallel/config", get(control::get_parallel_config))
@@ -273,8 +276,12 @@ async fn get_stats(State(state): State<Arc<AppState>>) -> Json<StatsResponse> {
         .filter(|t| t.status == TaskStatus::Failed)
         .count();

-    // Calculate total cost (would need to track this properly in production)
-    let total_cost_cents = 0u64; // TODO: Track actual costs
+    // Calculate total cost from runs in database
+    let total_cost_cents = if let Some(mem) = &state.memory {
+        mem.supabase.get_total_cost_cents().await.unwrap_or(0)
+    } else {
+        0
+    };

     let finished = completed_tasks + failed_tasks;
     let success_rate = if finished > 0 {
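The aggregation inside `get_total_cost_cents` (skip null columns, clamp a negative sum to zero) can be isolated as a pure function; a sketch mirroring the diff's logic, with the rows modeled as plain `Option<i64>` values:

```rust
// Sum nullable per-run costs the way the diff does: NULL columns are skipped
// via filter_map, and a (theoretically possible) negative total is clamped
// to zero before the cast to u64.
fn total_cost_cents(rows: &[Option<i64>]) -> u64 {
    let total: i64 = rows.iter().filter_map(|c| *c).sum();
    total.max(0) as u64
}
```

Clamping before the `as u64` cast matters: casting a negative `i64` directly would wrap to a huge unsigned value.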
src/bin/desktop_mcp.rs: new file, 1027 lines (diff suppressed because it is too large).
@@ -105,9 +105,35 @@ impl SupabaseClient {
             .header("Authorization", format!("Bearer {}", self.service_role_key))
             .send()
             .await?;

         Ok(resp.json().await?)
     }

+    /// Get total cost across all runs (in cents).
+    pub async fn get_total_cost_cents(&self) -> anyhow::Result<u64> {
+        // Fetch only the total_cost_cents column for efficiency
+        #[derive(serde::Deserialize)]
+        struct CostOnly {
+            total_cost_cents: Option<i64>,
+        }
+
+        let resp = self.client
+            .get(format!(
+                "{}/runs?select=total_cost_cents",
+                self.rest_url()
+            ))
+            .header("apikey", &self.service_role_key)
+            .header("Authorization", format!("Bearer {}", self.service_role_key))
+            .send()
+            .await?;
+
+        let costs: Vec<CostOnly> = resp.json().await?;
+        let total: i64 = costs.iter()
+            .filter_map(|c| c.total_cost_cents)
+            .sum();
+
+        Ok(total.max(0) as u64)
+    }

     // ==================== Tasks ====================

@@ -686,9 +712,94 @@ impl SupabaseClient {

         Ok(())
     }

+    /// Delete a mission by ID.
+    /// Returns true if the mission was deleted, false if it didn't exist.
+    pub async fn delete_mission(&self, id: Uuid) -> anyhow::Result<bool> {
+        let resp = self.client
+            .delete(format!("{}/missions?id=eq.{}", self.rest_url(), id))
+            .header("apikey", &self.service_role_key)
+            .header("Authorization", format!("Bearer {}", self.service_role_key))
+            .header("Prefer", "return=representation")
+            .send()
+            .await?;
+
+        if !resp.status().is_success() {
+            let text = resp.text().await?;
+            anyhow::bail!("Failed to delete mission: {}", text);
+        }
+
+        // Check if anything was actually deleted
+        let deleted: Vec<DbMission> = resp.json().await?;
+        Ok(!deleted.is_empty())
+    }
+
+    /// Delete all empty "Untitled" missions (no history, no title set).
+    /// Returns the count of deleted missions.
+    pub async fn delete_empty_untitled_missions(&self) -> anyhow::Result<usize> {
+        self.delete_empty_untitled_missions_excluding(&[]).await
+    }
+
+    /// Delete all empty "Untitled" missions (no history, no title set),
+    /// excluding the specified mission IDs (e.g., currently running missions).
+    /// Returns the count of deleted missions.
+    pub async fn delete_empty_untitled_missions_excluding(&self, exclude_ids: &[Uuid]) -> anyhow::Result<usize> {
+        // Minimal struct for partial field selection - avoids deserialization errors
+        // when querying only id, title, history fields (DbMission has more required fields)
+        #[derive(serde::Deserialize)]
+        struct PartialMission {
+            id: Uuid,
+            #[allow(dead_code)]
+            title: Option<String>,
+            history: serde_json::Value,
+        }
+
+        // First get missions with null or "Untitled Mission" title and empty history
+        let resp = self.client
+            .get(format!(
+                "{}/missions?select=id,title,history&or=(title.is.null,title.eq.Untitled%20Mission)&status=eq.active",
+                self.rest_url()
+            ))
+            .header("apikey", &self.service_role_key)
+            .header("Authorization", format!("Bearer {}", self.service_role_key))
+            .send()
+            .await?;
+
+        if !resp.status().is_success() {
+            let text = resp.text().await?;
+            anyhow::bail!("Failed to query empty missions: {}", text);
+        }
+
+        let missions: Vec<PartialMission> = resp.json().await?;
+
+        // Filter to only those with empty history (history is a JSON array)
+        // and not in the exclude list (e.g., currently running missions)
+        let empty_ids: Vec<Uuid> = missions
+            .into_iter()
+            .filter(|m| m.history.as_array().map_or(true, |arr| arr.is_empty()))
+            .filter(|m| !exclude_ids.contains(&m.id))
+            .map(|m| m.id)
+            .collect();
+
+        if empty_ids.is_empty() {
+            return Ok(0);
+        }
+
+        // Delete in batches
+        let mut deleted_count = 0;
+        for id in &empty_ids {
+            if self.delete_mission(*id).await? {
+                deleted_count += 1;
+            }
+        }
+
+        Ok(deleted_count)
+    }

     // ==================== User Facts ====================

     /// Insert a user fact into memory.
     pub async fn insert_user_fact(
         &self,
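The cleanup filter's core decision (empty history, not currently running) can be shown with history modeled as a plain `Vec` instead of `serde_json::Value`; this is a simplification for illustration, with integer IDs standing in for `Uuid`:

```rust
// Hypothetical simplified mirror of delete_empty_untitled_missions_excluding's
// filter: a mission is deletable when its history is empty and its ID is not
// in the exclusion list supplied by the control actor.
struct PartialMission {
    id: u64,
    history: Vec<String>,
}

fn empty_mission_ids(missions: Vec<PartialMission>, exclude_ids: &[u64]) -> Vec<u64> {
    missions
        .into_iter()
        .filter(|m| m.history.is_empty())
        .filter(|m| !exclude_ids.contains(&m.id))
        .map(|m| m.id)
        .collect()
}
```

Excluding running IDs is what prevents the race noted in the handler: a just-started mission legitimately has an empty history row while it executes.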