chore: update client submodule and linting artifacts
- Updated client submodule to commit 87642fa.
- Refreshed basedpyright lint artifacts: recorded version is now 1.36.1 (down from 1.36.2) and analysis time dropped from 13.105s to 11.303s.
- Added new documentation files for ongoing sprints addressing wiring gaps and error observability audits.
@@ -1,13 +1,13 @@
 {
-  "version": "1.36.2",
-  "time": "1767924682759",
+  "version": "1.36.1",
+  "time": "1767944627476",
   "generalDiagnostics": [],
   "summary": {
     "filesAnalyzed": 705,
     "errorCount": 0,
     "warningCount": 0,
     "informationCount": 0,
-    "timeInSec": 13.105
+    "timeInSec": 11.303
   }
 }
Submodule client updated: d8b1fb5375...87642fabcf
@@ -0,0 +1,312 @@

# Sprint: Error Handling & Observability Hygiene (Audit)

> **Priority**: 1 | **Owner**: Backend + Client | **Complexity**: Medium | **Status**: Proposed
> **Scope**: `src/` + `client/` | **Date**: 2026-01-09

---

## Objective

Eliminate duplicate error-handling patterns, suppressed failures, and non-centralized logging while keeping behavior unchanged. Execution **must** remove duplicates (not add wrappers), keep code clean, avoid compatibility layers, and pass all quality checks.

---

## Execution Guardrails (Read First)

**No compatibility layers or wrappers**

- Do **not** introduce new “safeX”, “compat”, or “adapter” helpers.
- Use **existing** primitives only: `fire_webhook_safe`, `clientLog` / `addClientLog`, `toastError`, `storage-utils`.
- If you need a new log event, add a method to `clientLog` (existing module) rather than a new wrapper.

**No duplicates**

- When consolidating, **remove or rewrite** duplicate implementations in the same change.
- Do not leave legacy helpers in place “just in case”.

**Clean code**

- Use structured logging fields already used in the codebase.
- Keep method names and scopes consistent with adjacent modules.

**Quality gates**

- Run `make quality` before completion.
- If changes are non-trivial, run `pytest tests/quality/`.

---

## Findings Summary

| Category | Evidence (sample locations) | Impact |
|---|---|---|
| Duplicate fire-and-forget webhook handling | `src/noteflow/grpc/_mixins/errors/_webhooks.py:19`, `src/noteflow/grpc/_mixins/meeting/_stop_ops.py:72`, `src/noteflow/grpc/_mixins/summarization/_generation_mixin.py:96`, `src/noteflow/application/services/webhook_service.py:208` | Inconsistent logging/metadata and repeated error-handling code paths for the same behavior. |
| Silent UI data fetch failures | `client/src/pages/Home.tsx:25`, `client/src/pages/MeetingDetail.tsx:196`, `client/src/pages/Tasks.tsx:60` | UI can appear empty with no diagnostics; missing signals in client logs. |
| Silent settings + secret load failures | `client/src/pages/Settings.tsx:134`, `client/src/components/settings/ai-config-section.tsx:50` | Secure storage failures and connection checks are suppressed with no logs or toasts. |
| Non-centralized client logging (console.warn scattered) | `client/src/lib/storage-utils.ts:38`, `client/src/lib/cache/meeting-cache.ts:41`, `client/src/lib/preferences-sync.ts:59`, `client/src/contexts/workspace-context.tsx:22` | Events bypass the client log buffer/analytics; inconsistent formatting and no correlation. |
| localStorage helper duplication | `client/src/lib/storage-utils.ts:1` vs `client/src/contexts/storage.ts:1` and context-level helpers | Multiple patterns for the same storage access and error handling; harder to harden or instrument. |
| Unhandled storage write path | `client/src/lib/preferences-sync.ts:75` | `localStorage.setItem` lacks try/catch; could throw and break the sync flow. |
| Quiet trace context failures | `src/noteflow/infrastructure/logging/log_buffer.py:139` | OTel context extraction failures are silent; debugging trace gaps is harder. |

---

## Evidence + Excerpts (Current State)

### 1) Duplicate webhook error handling

**Existing centralized helper**

```py
# src/noteflow/grpc/_mixins/errors/_webhooks.py:19
async def fire_webhook_safe(webhook_coro: Awaitable[T], event_name: str) -> T | None:
    try:
        return await webhook_coro
    except Exception:
        logger.exception("Failed to trigger %s webhooks", event_name)
        return None
```

**Duplicate local handlers**

```py
# src/noteflow/grpc/_mixins/meeting/_stop_ops.py:72
try:
    await webhook_service.trigger_recording_stopped(...)
except Exception:
    logger.exception("Failed to trigger recording.stopped webhooks")
```

```py
# src/noteflow/grpc/_mixins/summarization/_generation_mixin.py:96
try:
    meeting.summary = summary
    await self.webhook_service.trigger_summary_generated(meeting)
except Exception:
    logger.exception("Failed to trigger summary.generated webhooks")
```

**Callout**: Consolidate on `fire_webhook_safe` or the shared fire-and-forget pattern. Do not add a new helper wrapper.

---

### 2) Silent page-level fetch failures

```ts
// client/src/pages/Home.tsx:25
try {
  const response = await getAPI().listMeetings({ limit: 5, sort_order: 'newest' });
  setMeetings(response.meetings);
} catch {
  // Error swallowed intentionally - meeting list load is non-critical for initial render
} finally {
  setLoading(false);
}
```

```ts
// client/src/pages/MeetingDetail.tsx:196
try {
  const data = await getAPI().getMeeting({ meeting_id: id, include_segments: true, include_summary: true });
  setMeeting(data);
} catch {
  // Error swallowed intentionally - meeting fetch failure handled via loading state
} finally {
  setLoading(false);
}
```

**Callout**: These must report a non-critical error to `clientLog` without changing UX (no new toasts unless already used elsewhere).

---

### 3) Silent settings + secret hydration failures

```ts
// client/src/pages/Settings.tsx:134
try {
  const integrationsWithSecrets = await loadAllSecrets(preferences.getIntegrations());
  setIntegrations(integrationsWithSecrets);
} catch {
  // Error swallowed intentionally - API key decryption failure is non-critical
} finally {
  setLoadingApiKeys(false);
}
```

```ts
// client/src/components/settings/ai-config-section.tsx:50
try {
  const [transcriptionKey, summaryKey, embeddingKey] = await Promise.all([...]);
  ...
} catch {
  // Error swallowed intentionally - API key decryption failure is non-critical
}
```

**Callout**: Log to `clientLog` (non-fatal) and keep behavior intact.

---

### 4) Console logging bypasses client log buffer

```ts
// client/src/lib/storage-utils.ts:38
console.warn(`[Storage${context ? `:${context}` : ''}] Write failed ...`, error);
```

```ts
// client/src/lib/cache/meeting-cache.ts:41
console.warn('[MeetingCache] Cache event listener error:', error);
```

**Callout**: Replace with `addClientLog` or `clientLog` entries; avoid creating new wrappers.

---

### 5) localStorage helper duplication

```ts
// client/src/contexts/storage.ts:11
export function clearStoredProjectIds(): void {
  try {
    const keys = Object.keys(localStorage);
    ...
  } catch (error) {
    console.warn('[Storage] Failed to clear stored project IDs:', error);
  }
}
```

**Callout**: Replace these ad-hoc functions with `storage-utils` functions and centralize logging there.

---

### 6) Unhandled storage write in preferences sync

```ts
// client/src/lib/preferences-sync.ts:75
function saveMeta(meta: PreferencesSyncMeta): void {
  if (typeof window === 'undefined') {
    return;
  }
  localStorage.setItem(PREFERENCES_SYNC_META_KEY, JSON.stringify(meta));
  for (const listener of listeners) {
    listener(meta);
  }
}
```

**Callout**: Wrap the write in try/catch and emit a client log entry on failure.

---

### 7) Quiet trace context failures

```py
# src/noteflow/infrastructure/logging/log_buffer.py:139
try:
    from opentelemetry import trace
    ...
    return trace_id, span_id
except Exception:
    return None, None
```

**Callout**: Add debug-level logging (rate-limited) so missing trace contexts are visible in logs.

---

## Remediation Plan (No New Wrappers)

### Task A: Centralize webhook fire-and-forget

**Files**

- `src/noteflow/grpc/_mixins/errors/_webhooks.py`
- `src/noteflow/grpc/_mixins/meeting/_stop_ops.py`
- `src/noteflow/grpc/_mixins/summarization/_generation_mixin.py`
- `src/noteflow/application/services/webhook_service.py`

**Instructions**

- Replace local `try/except Exception` blocks with `fire_webhook_safe`.
- If extra metadata is needed, extend `fire_webhook_safe` (same file) rather than adding a new helper.
- Ensure each webhook call uses a consistent event name.

**Do Not**

- Add a new wrapper helper in another module.
- Keep old handlers alongside the centralized helper.

---

### Task B: Log non-critical client failures without UI changes

**Files**

- `client/src/pages/Home.tsx`
- `client/src/pages/MeetingDetail.tsx`
- `client/src/pages/Tasks.tsx`
- `client/src/pages/Settings.tsx`
- `client/src/components/settings/ai-config-section.tsx`

**Instructions**

- Use `addClientLog` or `clientLog` directly in `catch` blocks.
- Keep the existing loading state / non-blocking behavior.
- Use structured fields (meeting_id, context) when available.

**Do Not**

- Add a new `reportNonCriticalError()` helper.
- Add new UI toasts unless already used in that flow.

---

### Task C: Standardize localStorage access via `storage-utils`

**Files**

- `client/src/lib/storage-utils.ts`
- `client/src/contexts/storage.ts`
- `client/src/contexts/workspace-context.tsx`
- `client/src/contexts/project-context.tsx`
- `client/src/lib/cache/meeting-cache.ts`
- `client/src/lib/preferences-sync.ts`

**Instructions**

- Refactor localStorage read/write/remove to use `readStorage`, `writeStorage`, `removeStorage`, `clearStorageByPrefix`.
- Move any error logging into `storage-utils` (using `addClientLog`), not scattered call sites.
- Remove duplicated storage helpers once call sites are migrated.

**Do Not**

- Introduce a compatibility layer that re-exports both old and new APIs.

---

### Task D: Harden preferences sync persistence

**Files**

- `client/src/lib/preferences-sync.ts`

**Instructions**

- Wrap `saveMeta()` storage writes in try/catch.
- On failure, log a client warning and keep in-memory state in sync.

---

### Task E: Trace context visibility

**Files**

- `src/noteflow/infrastructure/logging/log_buffer.py`

**Instructions**

- Add a debug log when `_get_current_trace_context()` fails.
- Rate-limit or guard it to avoid log spam.

---

## Acceptance Criteria

- No `catch {}` blocks in user-facing flows without a log event (clientLog/addClientLog) or toast.
- No duplicate webhook error handlers; all fire-and-forget hooks flow through a single helper.
- All localStorage access paths use `storage-utils` (no duplicate helper functions).
- No new compatibility layers or wrapper utilities were added.
- Preferences sync storage cannot throw uncaught errors.
- Trace context failures are visible via debug logs.
- `make quality` passes.

---

## Notes

This sprint is still an audit-driven remediation plan. It **must** avoid expanding scope beyond logging/error-handling consolidation.
@@ -0,0 +1,338 @@

# Implementation Checklist: SPRINT-GAP-012

Step-by-step implementation guide for state synchronization and observability fixes.

## Pre-Implementation

- [ ] Read and understand the full README.md
- [ ] Ensure the dev backend is running (do **not** run Docker commands without explicit permission)
- [ ] Run `make quality` to establish a baseline (only if you plan code changes)

---

## Phase 1: Server Address Timing Fix (Priority: P1)

### 1.1 Add effectiveServerUrl Refresh After Connect

**File**: `client/src/pages/Settings.tsx`

Find `handleConnect()` (around line 230) and add a URL refresh after successful connection:

```typescript
const handleConnect = async () => {
  setIsConnecting(true);
  try {
    const normalized = normalizeServerInput(serverHost, serverPort);
    if (normalized.host !== serverHost) {
      setServerHost(normalized.host);
    }
    if (normalized.port !== serverPort) {
      setServerPort(normalized.port);
    }
    const hostValue = normalized.host;
    const portValue = normalized.port;
    localStorage.setItem('noteflow_server_address_override', 'true');
    localStorage.setItem(
      'noteflow_server_address_override_value',
      JSON.stringify({ host: hostValue, port: portValue, updated_at: Date.now() })
    );
    if (hostValue || portValue) {
      preferences.setServerConnection(hostValue, portValue);
    }
    const updatedPrefs = preferences.get();
    const api = isTauriEnvironment() ? await initializeTauriAPI() : getAPI();
    await api.savePreferences(updatedPrefs);
    const info = await api.connect(buildServerUrl(hostValue, portValue));
    setIsConnected(true);
    setServerInfo(info);

    // ADD: Refresh effective URL after successful connection
    const urlInfo = await api.getEffectiveServerUrl();
    setEffectiveServerUrl(urlInfo);
  } catch (error) {
    setIsConnected(false);
    toastError({
      title: 'Connection Failed',
      error,
      fallback: 'Unable to reach server',
    });
  } finally {
    setIsConnecting(false);
  }
};
```

**Verify**:
- [ ] Connect to a new server address
- [ ] Navigate to another page
- [ ] Navigate back to Settings
- [ ] The address should match what was saved

### 1.2 Add E2E Test for Address Persistence

**File**: `client/e2e/settings-ui.spec.ts` (extend existing suite)

```typescript
import { test, expect } from '@playwright/test';
import { navigateTo, waitForLoadingComplete } from './fixtures';

test.describe('Server Connection Section', () => {
  test('server address persists across navigation', async ({ page }) => {
    await navigateTo(page, '/settings?tab=status');
    await waitForLoadingComplete(page);

    const hostInput = page.locator('#host');
    const portInput = page.locator('#port');

    await hostInput.clear();
    await hostInput.fill('127.0.0.1');
    await portInput.clear();
    await portInput.fill('50051');

    // Navigate away
    await page.goto('/');
    await page.waitForLoadState('networkidle');

    await navigateTo(page, '/settings?tab=status');
    await waitForLoadingComplete(page);

    // Verify values persisted
    await expect(hostInput).toHaveValue('127.0.0.1');
    await expect(portInput).toHaveValue('50051');
  });
});
```

**Verify**:
- [ ] Run `make e2e` or `npx playwright test settings-ui`
- [ ] Tests pass

---

## Phase 2: OAuth State Consistency (Priority: P1)

### 2.1 Extend Existing Validation Helper

**File**: `client/src/lib/integration-utils.ts`

The file already has `hasRequiredIntegrationFields()`. Add the OIDC case (and optionally a wrapper):

```typescript
// In the existing switch statement, add the OIDC case:
export function hasRequiredIntegrationFields(integration: Integration): boolean {
  switch (integration.type) {
    case 'auth':
      return !!(integration.oauth_config?.client_id && integration.oauth_config?.client_secret);
    case 'email':
      return integration.email_config?.provider_type === 'api'
        ? !!integration.email_config?.api_key
        : !!(integration.email_config?.smtp_host && integration.email_config?.smtp_username);
    case 'calendar':
      return !!(integration.oauth_config?.client_id && integration.oauth_config?.client_secret);
    case 'pkm':
      return !!(integration.pkm_config?.api_key || integration.pkm_config?.vault_path);
    case 'custom':
      return !!integration.webhook_config?.url;
    // ADD THIS CASE:
    case 'oidc':
      return !!(integration.oidc_config?.issuer_url && integration.oidc_config?.client_id);
    default:
      return false;
  }
}

// Optional convenience wrapper:
/** Check if integration is connected AND has required credentials */
export const isEffectivelyConnected = (integration: Integration): boolean =>
  integration.status === 'connected' && hasRequiredIntegrationFields(integration);
```

**Verify**:
- [ ] File compiles without errors
- [ ] Run `npm run type-check`

### 2.2 Add Validation to handleIntegrationToggle

**File**: `client/src/components/settings/integrations-section.tsx`

Find `handleIntegrationToggle` (around line 122) and add validation:

```typescript
import { hasRequiredIntegrationFields } from '@/lib/integration-utils';

const handleIntegrationToggle = (integration: Integration) => {
  // ADD: Validate before allowing connect
  if (integration.status === ConnStatus.DISCONNECTED) {
    if (!hasRequiredIntegrationFields(integration)) {
      toast({
        title: 'Missing credentials',
        description: `Configure ${integration.name} credentials before connecting`,
        variant: 'destructive',
      });
      return;
    }
  }

  const newStatus =
    integration.status === ConnStatus.CONNECTED ? ConnStatus.DISCONNECTED : ConnStatus.CONNECTED;
  preferences.updateIntegration(integration.id, { status: newStatus });
  setIntegrations(preferences.getIntegrations());
  toast({
    title: newStatus === ConnStatus.CONNECTED ? 'Connected' : 'Disconnected',
    description: `${integration.name} ${newStatus}`,
  });
};
```

**Verify**:
- [ ] Toggling an integration without credentials shows an error toast
- [ ] Toggling an integration with valid credentials succeeds
- [ ] Disconnecting always works (no validation needed)

### 2.3 Add Credential Warning Indicator (Optional Enhancement)

**File**: Find the integration card/list component

Add a visual indicator for "connected" integrations that are missing credentials:

```typescript
import { AlertTriangle } from 'lucide-react';
import { hasRequiredIntegrationFields } from '@/lib/integration-utils';

// Near the status indicator:
{integration.status === 'connected' && !hasRequiredIntegrationFields(integration) && (
  <Tooltip>
    <TooltipTrigger>
      <AlertTriangle className="h-4 w-4 text-warning" />
    </TooltipTrigger>
    <TooltipContent>
      Credentials missing - integration may not work correctly
    </TooltipContent>
  </Tooltip>
)}
```

**Verify**:
- [ ] Warning icon appears on misconfigured integrations
- [ ] Tooltip provides helpful guidance

---

## Phase 3: E2E Test Enrichment (Priority: P2)

Use the existing fixtures from `client/e2e/fixtures.ts` (`callAPI`, `navigateTo`, `waitForAPI`, `waitForLoadingComplete`).

### 3.1 Add Integration Validation Test

**File**: `client/e2e/ui-integration.spec.ts` or `client/e2e/settings-ui.spec.ts` (extend existing suite)

```typescript
import { test, expect } from '@playwright/test';
import { navigateTo, waitForLoadingComplete } from './fixtures';

const shouldRun = process.env.NOTEFLOW_E2E === '1';

test.describe('Integration Validation', () => {
  test.skip(!shouldRun, 'Set NOTEFLOW_E2E=1 to enable end-to-end tests.');

  test('integration without credentials warns on connect', async ({ page }) => {
    await navigateTo(page, '/settings?tab=integrations');
    await waitForLoadingComplete(page);

    const toggleButton = page.locator('button, [role="switch"]').filter({ hasText: /connect/i });
    if (await toggleButton.first().isVisible()) {
      await toggleButton.first().click();
      const toast = page.locator('[role="alert"]');
      await expect(toast).toBeVisible();
    }
  });
});
```

**Verify**:
- [ ] Run `NOTEFLOW_E2E=1 npx playwright test ui-integration settings-ui`
- [ ] Tests pass

---

## Phase 4: Logging Improvements (Priority: P2)

### 4.1 Audit contextlib.suppress Usage

Run this search to find all suppress usages:

```bash
rg -n "contextlib\.suppress" src/noteflow
```

For each occurrence, evaluate:
- [ ] Is the exception being logged before suppression?
- [ ] Should this be a try/except with logging instead?

### 4.2 Add Logging to Silent Returns

Search for silent returns in critical paths:

```bash
rg -n "return$" src/noteflow/grpc/_mixins | head -20
```

For each early return:
- [ ] Add a `logger.debug()` explaining why the return happened
- [ ] Include relevant context (IDs, state)
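
The pattern for both bullets is small. This sketch uses hypothetical names (`flush_segments`, `meeting_id`); the point is the shape of the logged early return, not a real noteflow function:

```python
# Sketch for Phase 4.2: an early return made visible at DEBUG level.
import logging

logger = logging.getLogger(__name__)


def flush_segments(meeting_id: str, segments: list[str]) -> int:
    if not segments:
        # Before: a bare `return` with no signal.
        # After: say why we bailed out, with the relevant ID attached.
        logger.debug("flush_segments: nothing to flush", extra={"meeting_id": meeting_id})
        return 0
    # ... persist segments ...
    return len(segments)
```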

### 4.3 Document Logging Standards

**File**: `docs/observability.md` (append a section) or `docs/logging-standards.md` (new file under docs/)

````markdown
## Logging Standards

### Principles

1. **Never suppress without logging**: Replace `contextlib.suppress()` with explicit try/except
2. **Log early returns**: Any function that returns early should log at DEBUG level
3. **Structured logging**: Always include context (IDs, operation names)
4. **Appropriate levels**:
   - ERROR: Unexpected failures
   - WARNING: Expected failures (validation errors)
   - INFO: Significant state changes
   - DEBUG: Operational flow details

### Examples

#### Before (Bad)
```python
with contextlib.suppress(Exception):
    await risky_operation()
```

#### After (Good)
```python
try:
    await risky_operation()
except SpecificError as e:
    logger.warning("Expected error in operation", error=str(e))
except Exception as e:
    logger.error("Unexpected error in operation", error=str(e), exc_info=True)
```
````

---

## Validation Checklist

After completing all phases:

- [ ] `make quality` passes (when code changes are made)
- [ ] `make e2e` passes (or tests are appropriately skipped when added)
- [ ] Server address persists across navigation (manual test)
- [ ] Integration warning badges appear correctly
- [ ] No new TypeScript or Python lint errors

## Rollback Plan

If issues arise:

1. Revert changes to `Settings.tsx`
2. Remove new test files (they don't affect production)
3. Keep documentation updates
@@ -0,0 +1,568 @@

# SPRINT-GAP-012: State Synchronization & Observability Gaps

| Attribute | Value |
|-----------|-------|
| **Sprint** | GAP-012 |
| **Size** | L (Large) |
| **Owner** | TBD |
| **Phase** | Hardening |
| **Prerequisites** | None |

## Executive Summary

Investigation revealed multiple gaps in state synchronization, observability, and test coverage that cause confusing user experiences and difficult debugging:

1. **Server Address Timing Bug**: After saving a new server address and navigating away, the displayed address resets to the previous value despite successful persistence
2. **OAuth State Consistency**: Integrations display "connected" status with blank OAuth fields, causing silent sync failures
3. **Server Logging Gaps**: Silent failures throughout the codebase make debugging impossible
4. **E2E Test Coverage**: Tests validate structure but not functional round-trips

## Current Status (reviewed January 9, 2026)

- **Still open**: Server address timing/synchronization bug (effective URL not refreshed after connect).
- **Still open**: Integration toggle can mark "connected" without required credentials; OIDC validation is missing in `hasRequiredIntegrationFields()`.
- **Partially addressed**: Logging gaps — ASR and webhook paths now log more failures, but standards/audit work remains.
- **Partially addressed**: E2E coverage improved (connection + settings UI), but no persistence or lifecycle tests.

## Open Issues

- [ ] Define the canonical source of truth for the server address (local override vs preferences vs Rust state vs gRPC client)
- [ ] Determine the OAuth credential loading strategy (eager vs lazy)
- [ ] Establish logging standards for failure modes (doc location + conventions)
- [ ] Define e2e test functional coverage requirements (persistence, lifecycle, error recovery)

---

## Issue 1: Server Address Timing/Synchronization Bug

### Symptom

User saves server address `127.0.0.1:50051`, connection succeeds, but navigating to another page and back shows `192.168.50.151:50051` (the previous value).

### Root Cause Analysis

The server address exists in **multiple locations** with different update timings:

| Location | Update Trigger | Read On |
|----------|---------------|---------|
| TypeScript localStorage (`noteflow_preferences`) | `preferences.setServerConnection()` | Page mount via `preferences.get()` |
| Local override (`noteflow_server_address_override_value`) | `handleConnect()` | `preferences.get()` hydration |
| Rust `AppState.preferences` | `save_preferences` IPC command | `get_effective_server_url` IPC command |
| gRPC client internal `endpoint` | `connect()` IPC command | Connection health checks |

### Timing Gap in `handleConnect()`

**File**: `client/src/pages/Settings.tsx` (lines 260-289)

```typescript
const handleConnect = async () => {
  setIsConnecting(true);
  try {
    const normalized = normalizeServerInput(serverHost, serverPort);
    if (normalized.host !== serverHost) setServerHost(normalized.host);
    if (normalized.port !== serverPort) setServerPort(normalized.port);

    // Local override + preferences
    localStorage.setItem('noteflow_server_address_override', 'true');
    localStorage.setItem(
      'noteflow_server_address_override_value',
      JSON.stringify({ host: normalized.host, port: normalized.port, updated_at: Date.now() })
    );
    preferences.setServerConnection(normalized.host, normalized.port);

    // Persist to Rust state (Tauri) and connect via gRPC
    const api = isTauriEnvironment() ? await initializeTauriAPI() : getAPI();
    await api.savePreferences(preferences.get());
    const info = await api.connect(buildServerUrl(normalized.host, normalized.port));

    setIsConnected(true);
    setServerInfo(info);
    // BUG: effectiveServerUrl is NOT re-fetched here
  } catch (error) {
    // ...
  }
};
```

### Critical Bug

After `handleConnect()` completes:
- `effectiveServerUrl` state still holds the **old** value from mount
- When the user navigates away, the component unmounts; on return:
  - `checkConnection()` fetches `effectiveServerUrl` from Rust state
  - Rust state may prefer `server_address_customized` (from preferences) over env/default

**File**: `client/src-tauri/src/commands/connection.rs` (lines 100-121)

```rust
pub fn get_effective_server_url(state: State<'_, Arc<AppState>>) -> EffectiveServerUrl {
    let prefs = state.preferences.read();
    let cfg = config();
    let prefs_url = format!("{}:{}", prefs.server_host, prefs.server_port);

    // If preferences explicitly customized, use them
    if prefs.server_address_customized && !prefs.server_host.is_empty() {
        return EffectiveServerUrl {
            url: prefs_url,
            source: ServerAddressSource::Preferences,
        };
    }

    EffectiveServerUrl {
        url: cfg.server.default_address.clone(),
        source: cfg.server.address_source,
    }
}
```

### Proposed Fix

**After a successful connect, re-fetch `effectiveServerUrl`**:

```typescript
// In handleConnect(), after setServerInfo(info):
const urlInfo = await api.getEffectiveServerUrl();
setEffectiveServerUrl(urlInfo);
```

This ensures the displayed URL reflects the actual connected state after all async operations complete.

### Validation Test (UI)

```typescript
// e2e test to add
test('server address persists across navigation', async ({ page }) => {
  await page.goto('/settings?tab=status');
  await page.fill('#host', '127.0.0.1');
  await page.fill('#port', '50051');

  // Navigate away and back
  await page.goto('/');
  await page.goto('/settings?tab=status');

  // Verify persisted values
  await expect(page.locator('#host')).toHaveValue('127.0.0.1');
  await expect(page.locator('#port')).toHaveValue('50051');
});
```

---
|
||||
|
||||
## Issue 2: OAuth/Integration State Consistency
|
||||
|
||||
### Symptom
|
||||
|
||||
Integrations display "connected" status without valid credentials, causing:
|
||||
1. UI confusion (users see "connected" but credentials are missing)
|
||||
2. Potential failures when OAuth operations are attempted
|
||||
|
||||
**Note**: The sync scheduler only processes `calendar` and `pkm` integration types, so auth/oidc integrations with blank credentials don't cause sync failures directly.

### Root Cause Analysis

Primary vulnerability and contributing factors:

#### 2.0 Direct Status Toggle Without Validation (PRIMARY)

**File**: `client/src/components/settings/integrations-section.tsx` (lines 122-131)

```typescript
const handleIntegrationToggle = (integration: Integration) => {
  const newStatus =
    integration.status === ConnStatus.CONNECTED ? ConnStatus.DISCONNECTED : ConnStatus.CONNECTED;
  preferences.updateIntegration(integration.id, { status: newStatus });
  // NO VALIDATION - allows "connected" without credentials
};
```

This is the primary entry point for invalid state. Users can toggle integrations to "connected" without any credential validation.

#### 2.1 Empty OAuth Defaults

**File**: `client/src/lib/default-integrations.ts` (lines 21, 41-67)

```typescript
const emptyOAuth = { client_id: '', client_secret: '', redirect_uri: '', scopes: [] as string[] };

// Used in:
createIntegration('Google SSO', 'auth', {
  oauth_config: { ...emptyOAuth, scopes: ['openid', 'email', 'profile'] },
}),
```

Integrations start with empty credentials by design, but status can become "connected" through various paths.

#### 2.2 Status Updated Without Credential Validation

**File**: `client/src/lib/preferences.ts` (updateIntegration function)

The `updateIntegration()` function merges partial updates without validating that required fields for "connected" status are present:

```typescript
updateIntegration(id: string, updates: Partial<Integration>): void {
  withPreferences((prefs) => {
    const idx = prefs.integrations.findIndex((i) => i.id === id);
    if (idx !== -1) {
      prefs.integrations[idx] = { ...prefs.integrations[idx], ...updates };
      // No validation that connected status has valid oauth_config
    }
  });
}
```

#### 2.3 Type System Allows Invalid States

**File**: `client/src/api/types/requests/integrations.ts`

The `Integration` type allows `status: 'connected'` without requiring any config to be populated:

```typescript
interface Integration {
  id: string;
  name: string;
  type: IntegrationType;
  status: IntegrationStatus; // Can be 'connected'
  oauth_config?: OAuthConfig; // Optional - allows empty
  // ...
}
```
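
A stricter shape could make this state unrepresentable at compile time. The sketch below is illustrative only — the type and field names mirror the excerpt above but are assumptions, not the project's real API:

```typescript
// Simplified stand-in for the real OAuthConfig (names assumed).
interface OAuthConfigSketch {
  client_id: string;
  client_secret: string;
  redirect_uri: string;
  scopes: string[];
}

// Discriminated union: 'connected' requires oauth_config, so the
// "connected without credentials" state cannot be constructed.
type IntegrationStateSketch =
  | { status: 'disconnected'; oauth_config?: OAuthConfigSketch }
  | { status: 'connected'; oauth_config: OAuthConfigSketch };

function describeIntegration(state: IntegrationStateSketch): string {
  // TS narrows the union: oauth_config is guaranteed present here.
  return state.status === 'connected'
    ? `connected as ${state.oauth_config.client_id}`
    : 'disconnected';
}
```

Adopting such a union would be a larger refactor than the runtime validation proposed below, but it removes the whole class of bug rather than guarding one entry point.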

#### 2.4 Secrets Stored Separately from Integration Objects

OAuth secrets are encrypted and stored separately via `useSecureIntegrationSecrets`, but the integration object may be loaded without secrets:

```typescript
// In Settings.tsx loadEncryptedApiKeys():
const integrationsWithSecrets = await loadAllSecrets(preferences.getIntegrations());
setIntegrations(integrationsWithSecrets);
// If loadAllSecrets fails silently, integrations remain without credentials
```

#### 2.5 Backend Separation of Secrets

The backend stores OAuth secrets separately from Integration records (`IntegrationSecretModel`). When fetching integrations, secrets may not be joined, leading to "connected" integrations without credentials.

#### 2.6 Sync Trigger Doesn't Validate Credentials

**File**: `client/src/hooks/use-integration-sync.ts`

The scheduler only runs for `calendar` and `pkm` integrations with `integration_id`, but status can still be set to `connected` without required credentials for other integration types, leading to UI confusion and broken flows.

```typescript
// In triggerSync():
if (integration.status !== 'connected') {
  return { success: false, error: 'Integration not connected' };
}
// No check for oauth_config.client_id presence
```
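
The missing guard can be sketched as a pure, testable predicate. The types are simplified assumptions and the error strings (beyond the existing `'Integration not connected'`) are hypothetical:

```typescript
// Minimal stand-in for the Integration fields the guard needs.
interface SyncIntegrationSketch {
  status: string;
  oauth_config?: { client_id?: string };
}

// Sketch of the check triggerSync() should perform before syncing:
// reject both "not connected" and "connected but credential-less".
function canSync(integration: SyncIntegrationSketch): { ok: boolean; error?: string } {
  if (integration.status !== 'connected') {
    return { ok: false, error: 'Integration not connected' };
  }
  if (!integration.oauth_config?.client_id) {
    return { ok: false, error: 'Missing OAuth credentials' };
  }
  return { ok: true };
}
```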

#### 2.7 Graceful Error Handling Masks Root Cause

Sync failures are caught and displayed as toast errors, but the underlying "connected without credentials" state persists, causing repeated failures.

### Proposed Fixes

1. **Extend existing validation helper** (`client/src/lib/integration-utils.ts`):

The file already has `hasRequiredIntegrationFields()` - extend it to add OIDC support and a convenience wrapper:
```typescript
// Add OIDC case to existing switch statement
case 'oidc':
  return !!(integration.oidc_config?.issuer_url && integration.oidc_config?.client_id);

// Add convenience wrapper below existing function
export const isEffectivelyConnected = (integration: Integration): boolean =>
  integration.status === 'connected' && hasRequiredIntegrationFields(integration);
```

2. **Validate in handleIntegrationToggle** (`integrations-section.tsx`):
```typescript
import { hasRequiredIntegrationFields } from '@/lib/integration-utils';

const handleIntegrationToggle = (integration: Integration) => {
  if (integration.status === ConnStatus.DISCONNECTED) {
    // Trying to connect - validate first
    if (!hasRequiredIntegrationFields(integration)) {
      toast({
        title: 'Missing credentials',
        description: `Configure ${integration.name} credentials before connecting`,
        variant: 'destructive',
      });
      return;
    }
  }
  const newStatus =
    integration.status === ConnStatus.CONNECTED ? ConnStatus.DISCONNECTED : ConnStatus.CONNECTED;
  preferences.updateIntegration(integration.id, { status: newStatus });
  // ... rest unchanged
};
```

3. **UI indicator for missing credentials**:
Display warning badge on integrations that are "connected" but fail `hasRequiredIntegrationFields()`.
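
The badge condition itself is a one-line predicate. A minimal sketch, assuming the caller has already evaluated `hasRequiredIntegrationFields()` (the input shape here is hypothetical):

```typescript
// Inputs pre-computed by the caller; names are assumptions.
interface BadgeInputSketch {
  status: string;
  hasCredentials: boolean; // result of hasRequiredIntegrationFields()
}

// Warn only when the stored status claims "connected" but the
// required fields for that integration type are absent.
function showMissingCredentialsBadge(i: BadgeInputSketch): boolean {
  return i.status === 'connected' && !i.hasCredentials;
}
```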

### Affected Integrations

| Integration | OAuth Required | Credential Source |
|-------------|---------------|-------------------|
| Google SSO | Yes | User-provided |
| Microsoft 365 | Yes | User-provided |
| Google Calendar | Yes | OAuth flow |
| Outlook Calendar | Yes | OAuth flow |
| Authentik (OIDC) | Yes | User-provided |
| Keycloak (OIDC) | Yes | User-provided |

---

## Issue 3: Server Logging Gaps

### Symptom

Logging coverage has improved, but some failure paths still lack consistent, structured logging.

### Identified Logging Gaps

| Location | Pattern | Impact |
|----------|---------|--------|
| Broad exception handlers | Missing context fields or inconsistent levels | Root cause analysis harder |
| Early returns in hot paths | No debug logs on skipped work | Operational flow unclear |
| Background tasks | Missing start/finish logs | Difficult to trace async failures |

### Examples (Updated)

#### 3.1 ASR Processing (now logs)

```python
# src/noteflow/grpc/_mixins/streaming/_asr.py (approximate)
async def _process_audio_chunk(self, chunk: bytes) -> None:
    if not self._asr_engine:
        logger.error("ASR engine unavailable during segment processing", ...)
    # ...
```

**Fix**:
```python
async def _process_audio_chunk(self, chunk: bytes) -> None:
    if not self._asr_engine:
        logger.debug("ASR engine unavailable, skipping chunk processing")
        return
    # ...
```

#### 3.2 contextlib.suppress (now scoped)

```python
# Remaining usages are limited to task cancellation paths (acceptable).
```

**Fix**: Replace with explicit exception handling:
```python
try:
    await potentially_failing_operation()
except SpecificException as e:
    logger.warning("Operation failed", error=str(e), exc_info=True)
except Exception as e:
    logger.error("Unexpected failure in operation", error=str(e), exc_info=True)
```

#### 3.3 Webhook Delivery (now logs)

```python
# Background webhook delivery
try:
    delivery = await self._executor.deliver(...)
except Exception:
    _logger.exception("Unexpected error delivering webhook ...")
```

**Fix**:
```python
async def _deliver_with_logging(self, config, payload):
    try:
        await self._deliver_webhook(config, payload)
    except Exception as e:
        logger.error("Webhook delivery failed",
                     webhook_id=config.id,
                     error=str(e),
                     exc_info=True)

asyncio.create_task(self._deliver_with_logging(config, payload))
```

### Proposed Logging Standards (Still Needed)

1. **Always log on early returns**: Any function that returns early due to missing prerequisites should log at DEBUG level
2. **Never suppress without logging**: Replace `contextlib.suppress()` with try/except that logs
3. **Log background task lifecycle**: Log when background tasks start and when they complete/fail
4. **Structured logging for all failures**: Include context (IDs, operation names, parameters)
5. **Log at appropriate levels**:
   - ERROR: Unexpected failures that need investigation
   - WARNING: Expected failures (validation, user errors)
   - INFO: Significant state changes
   - DEBUG: Detailed operational flow

### Validation

Add log assertion tests (the original sketch used `await` inside a synchronous `def`; the test must be async):
```python
@pytest.mark.asyncio
async def test_asr_unavailable_logs_debug(caplog):
    with caplog.at_level(logging.DEBUG):
        await process_chunk_without_asr()
    assert "ASR engine unavailable" in caplog.text
```

---

## Issue 4: E2E Test Coverage Gaps

### Current State

- Playwright tests exist in `client/e2e/` (connection, settings UI, OAuth/OIDC, etc.)
- `connection.spec.ts` and `settings-ui.spec.ts` validate basic API round-trips and UI structure
- No persistence or lifecycle tests for server address, connection states, or integration validation

### Coverage Gaps

| Gap | Description | Risk |
|-----|-------------|------|
| No state persistence tests | Navigation doesn't verify local override + prefs sync | State bugs undetected |
| No connection lifecycle tests | Connect/disconnect/reconnect not tested | Connection bugs undetected |
| No integration validation tests | Toggle/connect without creds not tested | Credential bugs undetected |
| Limited error recovery tests | Mostly happy path | Error handling untested |
| Limited sync operation tests | Calendar/integration sync untested | Sync failures undetected |

### Recommended Test Additions

#### 4.1 Connection Persistence Test

```typescript
// client/e2e/connection-roundtrip.spec.ts
import { test, expect, navigateTo, waitForAPI } from './fixtures';

test.describe('Connection Round-Trip', () => {
  test.beforeEach(async ({ page }) => {
    await navigateTo(page, '/settings?tab=status');
    await waitForAPI(page);
  });

  test('persists connection across navigation', async ({ page }) => {
    await page.fill('#host', '127.0.0.1');
    await page.fill('#port', '50051');

    // Navigate away
    await page.goto('/meetings');
    await page.waitForLoadState('networkidle');

    // Navigate back
    await page.goto('/settings?tab=status');

    // Should still show saved address
    await expect(page.locator('#host')).toHaveValue('127.0.0.1');
    await expect(page.locator('#port')).toHaveValue('50051');
  });
});
```

#### 4.2 Integration Validation Test

```typescript
// client/e2e/integration-state.spec.ts
import { test, expect, navigateTo, waitForAPI } from './fixtures';

test.describe('Integration State Consistency', () => {
  test.beforeEach(async ({ page }) => {
    await navigateTo(page, '/settings?tab=integrations');
    await waitForAPI(page);
  });

  test('integrations tab loads without errors', async ({ page }) => {
    // Verify tab content is visible
    await expect(page.locator('.settings-tab-content')).toBeVisible();
    // No error toasts should appear on load
    const errorToast = page.locator('[role="alert"]').filter({ hasText: /error|failed/i });
    await expect(errorToast).not.toBeVisible({ timeout: 3000 });
  });

  test('integration without credentials shows warning when toggled', async ({ page }) => {
    // Find any integration toggle button
    const toggleButton = page.locator('button').filter({ hasText: /connect/i }).first();

    if (await toggleButton.isVisible()) {
      await toggleButton.click();

      // Should show validation toast if credentials missing
      // (after fix is implemented)
      const toast = page.locator('[role="alert"]');
      // Either success or "missing credentials" warning
      await expect(toast).toBeVisible({ timeout: 5000 });
    }
  });
});
```

#### 4.3 Connection Lifecycle Test (Optional)

```typescript
// client/e2e/recording-roundtrip.spec.ts
import { test, expect, callAPI, navigateTo, waitForAPI } from './fixtures';

test.describe('Recording Round-Trip', () => {
  test.skip(true, 'Requires running backend with audio capture');

  test('recording page loads', async ({ page }) => {
    await navigateTo(page, '/recording');
    await waitForAPI(page);

    // Verify recording UI is present
    const recordButton = page.locator('button').filter({ hasText: /start|record/i });
    await expect(recordButton).toBeVisible();
  });

  test('can list meetings via API', async ({ page }) => {
    await navigateTo(page, '/');
    await waitForAPI(page);

    // Verify API round-trip works
    const meetings = await callAPI(page, 'listMeetings');
    expect(Array.isArray(meetings)).toBe(true);
  });
});
```

### Test Infrastructure

The existing `client/e2e/fixtures.ts` already provides:
- `callAPI<T>(page, method, ...args)` - Call API methods and return typed results
- `navigateTo(page, path)` - Navigate with proper waiting
- `waitForAPI(page)` - Wait for API to be available
- `getConnectionState(page)` - Get current connection state

These fixtures should be used consistently across all new tests.

---

## Implementation Priority

| Issue | Priority | Effort | Impact |
|-------|----------|--------|--------|
| Server Address Timing | P1 | Small | High - User-facing bug |
| OAuth State Consistency | P1 | Medium | High - Silent failures |
| E2E Persistence + Validation Tests | P2 | Medium | Medium - Regression prevention |
| Logging Standards + Audit | P3 | Medium | Medium - Debugging capability |

## Success Criteria

1. Server address persists correctly across navigation, and the effective URL tooltip updates after connect
2. Integrations cannot reach "connected" status without valid credentials (including OIDC)
3. E2E tests cover persistence and integration validation
4. Logging standards documented and applied consistently
5. `make quality` passes when code changes are made

## References

- GAP-003: Error Handling Patterns
- GAP-004: Diarization Lifecycle
- GAP-011: Post-Processing Pipeline

---

> **New file:** `docs/sprints/phase-ongoing/.archive/wiring.md` (589 lines)

# Cross-Stack Wiring Gaps Analysis

> **Generated:** 2026-01-05
> **Last Updated:** 2026-01-05
> **Scope:** Frontend (TypeScript), Backend (Python), Desktop Client (Rust/Tauri)

This document identifies areas where the gRPC proto contract is not fully wired across all three layers.

---

## Core Features Status

The following features have their **individual RPCs wired** across all layers, but several lack **automatic orchestration** - they require manual user action to trigger.

| Feature | Proto RPC | Python Mixin | Rust Client | Rust Commands | TS Adapter | Orchestration | Notes |
|---------|-----------|--------------|-------------|---------------|------------|---------------|-------|
| **Transcription** | StreamTranscription | StreamingMixin | streaming/ | recording/ | ✅ | ✅ Auto | Real-time during recording |
| **Summaries** | GenerateSummary | SummarizationMixin | meetings.rs | summary.rs | ✅ | ⚠️ Manual | **GAP-W05**: No auto-trigger after recording |
| **Action Items** | (via Summary) | (via Summary) | (via Summary) | (via Summary) | ✅ | ⚠️ Manual | Depends on summary generation |
| **Annotations** | Add/Get/List/Update/Delete | AnnotationMixin | annotations.rs | annotation.rs | ✅ | ✅ User-driven | CRUD by design |
| **Preferences** | Get/SetPreferences | PreferencesMixin | preferences.rs | preferences.rs | ✅ | ✅ Auto | Sync on change |
| **Meetings** | Create/Get/List/Stop/Delete | MeetingMixin | meetings.rs | meeting.rs | ✅ | ✅ Auto | Full lifecycle |
| **Export** | ExportTranscript | ExportMixin | annotations.rs | export.rs | ✅ | ✅ User-driven | On-demand by design |
| **Diarization** | Refine/Rename/Status/Cancel | DiarizationMixin | diarization.rs | diarization.rs | ✅ | ⚠️ Manual | **GAP-W05**: No auto-trigger after recording |
| **Entities (NER)** | Extract/Update/Delete | EntitiesMixin | annotations.rs | entities.rs | ✅ | ⚠️ Manual | **GAP-W05**: Auto-extract disabled |
| **Calendar** | List/Providers/OAuth | CalendarMixin | calendar.rs | calendar.rs | ✅ | ✅ Auto | Sync scheduled |
| **Webhooks** | Register/List/Update/Delete | WebhooksMixin | webhooks.rs | webhooks.rs | ✅ | ✅ Auto | Fire on events |
| **Projects** | Full CRUD + Membership | ProjectMixin | projects.rs | projects.rs | ✅ | ✅ User-driven | CRUD by design |
| **Identity** | GetCurrentUser | IdentityMixin | identity.rs | identity.rs | ✅ | ✅ Auto | On connection |
| **Observability** | Logs/Metrics | ObservabilityMixin | observability.rs | observability.rs | ✅ | ✅ Auto | Background |
| **Sync** | Start/Status/History | SyncMixin | sync.rs | sync.rs | ✅ | ✅ Auto | Scheduled |

**Legend:**
- ✅ **Auto**: Triggers automatically at the appropriate time
- ✅ **User-driven**: Requires user action by design (CRUD, export, etc.)
- ⚠️ **Manual**: Requires manual trigger but should be automatic

### Tasks Page Note

The "Tasks" feature (`/tasks` page) aggregates **action items from summaries** across meetings. There is no separate Task entity in the proto - tasks are derived from `Summary.action_items`, which includes:
- `text`: Action item description
- `assignee`: Person assigned (if mentioned)
- `due_date`: Due date (if mentioned)
- `priority`: Priority level (high/medium/low)
- `segment_ids`: Links to transcript segments

Task completion status is stored in local preferences (`completed_tasks`).

**Note:** Tasks only appear after summaries are generated, which currently requires manual triggering (see GAP-W05).
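
The derivation described above can be sketched as a pure function. The `ActionItem`/summary shapes below follow the field list in this section, but the wrapper types, the `id` scheme, and the `completedTasks` set are illustrative assumptions:

```typescript
// Field names follow Summary.action_items as documented above.
interface ActionItemSketch {
  text: string;
  assignee?: string;
  due_date?: string;
  priority?: 'high' | 'medium' | 'low';
}

// Hypothetical per-meeting wrapper for this sketch.
interface MeetingSummarySketch {
  meetingId: string;
  action_items: ActionItemSketch[];
}

// Flatten action items across meetings, joining completion state
// from local preferences (completed_tasks), keyed by a derived id.
function deriveTasks(summaries: MeetingSummarySketch[], completedTasks: Set<string>) {
  return summaries.flatMap((s) =>
    s.action_items.map((item, idx) => ({
      id: `${s.meetingId}:${idx}`,
      ...item,
      completed: completedTasks.has(`${s.meetingId}:${idx}`),
    })),
  );
}
```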

---

## Identified Gaps

### Summary

| Gap ID | Feature | Severity | Backend | Rust Client | TS Adapter | Desktop Testing |
|--------|---------|----------|---------|-------------|------------|-----------------|
| GAP-W05 | Post-Processing Orchestration | ✅ **Complete** | ✅ | ✅ | ✅ | Verify Desktop |
| GAP-W01 | OIDC Provider Management | ✅ **Complete** | ✅ | ✅ | ✅ | Required |
| GAP-W02 | Workspace gRPC Methods | Informational | ✅ | ⚠️ Fallback | ✅ | Not Required |
| GAP-W03 | Preferences Sync Constants | Low | ✅ | ✅ | Bypassed | Not Required |
| ~~GAP-W04~~ | ~~OAuth Loopback Adapter~~ | ~~Removed~~ | — | — | — | — |

**Audit Notes (2026-01-05, updated):**
- GAP-W01: ✅ **Complete** - UI now registers OIDC providers with backend and uses the real `testOidcConnection` API
- GAP-W02: Downgraded - intentional local-first fallback behavior, not a bug
- GAP-W04: **Removed** - `use-oauth-flow.ts` directly invokes `initiate_oauth_loopback` (line 193); it works
- GAP-W05: ✅ **Complete** - autoExtract enabled, progress events wired, backend ProcessingStatus + webhook events in place. Desktop verification required.

---

## GAP-W05: Post-Processing Orchestration (Critical)

> **Full Details:** See [sprint-gap-011-post-processing-pipeline/README.md](./sprint-gap-011-post-processing-pipeline/README.md)

### Description

After a meeting recording completes, the system fails to automatically trigger post-processing workflows (summarization, entity extraction, diarization refinement). Users see only raw recordings without automatic transcriptions, summaries, or extracted intelligence. **The architecture has all the RPC components but lacks orchestration to connect them.**

### The Disconnect

```
┌────────────────────────────────────────────────────────────┐
│ WHAT EXISTS (Individual RPCs - All Working)                │
├────────────────────────────────────────────────────────────┤
│ GenerateSummary ─────────► Manual button click only        │
│ ExtractEntities ─────────► Auto-extract disabled           │
│ RefineSpeakerDiarization ► Manual button click only        │
└────────────────────────────────────────────────────────────┘

┌────────────────────────────────────────────────────────────┐
│ REMAINING WORK (Orchestration Layer)                       │
├────────────────────────────────────────────────────────────┤
│ ✅ usePostProcessing hook created (30 tests)               │
│ ✅ ProcessingStatus component created (22 tests)           │
│ ✅ E2E tests created (14 tests)                            │
│ ✅ Recording.tsx - flow acceptable (navigates to detail)   │
│ ✅ MeetingDetail.tsx - hook wired, component rendering     │
│ ⚠️ Auto-extract still disabled                             │
│ ⚠️ Summary progress events still not wired                 │
│ ⚠️ Backend ProcessingStatus not in proto/entity            │
└────────────────────────────────────────────────────────────┘
```

### Implementation Status

| Layer | File | Status |
|-------|------|--------|
| **Python Backend** | `src/noteflow/grpc/_mixins/summarization.py` | ✅ RPC works |
| **Python Backend** | `src/noteflow/grpc/_mixins/entities.py` | ✅ RPC works |
| **Python Backend** | `src/noteflow/grpc/_mixins/diarization/_mixin.py` | ✅ RPC works |
| **Rust gRPC Client** | `client/src-tauri/src/grpc/client/meetings.rs` | ✅ Wired |
| **Rust Commands** | `client/src-tauri/src/commands/summary.rs` | ✅ Emits progress events |
| **TS Adapter** | `client/src/api/tauri-adapter.ts` | ✅ All methods available |
| **TS Hooks** | `client/src/hooks/use-entity-extraction.ts` | ⚠️ Auto-extract disabled |
| **TS Hooks** | `client/src/hooks/use-diarization.ts` | ✅ Triggered via usePostProcessing |
| **TS Pages** | `client/src/pages/Recording.tsx:319-351` | ✅ Flow acceptable (navigates to detail) |
| **TS Pages** | `client/src/pages/MeetingDetail.tsx:40,85` | ✅ Hook and component wired |
| **TS Orchestration** | `client/src/hooks/use-post-processing.ts` | ✅ Created (30 tests) |
| **TS Progress** | `client/src/components/processing-status.tsx` | ✅ Created (22 tests) |
| **E2E Tests** | `client/e2e/post-processing.spec.ts` | ✅ Created (14 tests) |

### Code Excerpts

**Recording.tsx immediately navigates away:**
```typescript
// client/src/pages/Recording.tsx:313-346
const stopRecording = async () => {
  setIsRecording(false);
  streamRef.current?.close();

  const stoppedMeeting = await api.stopMeeting(meeting.id);
  setMeeting(stoppedMeeting);

  // GAP: Immediately navigates away - no processing triggered
  navigate(
    projectId
      ? `/projects/${projectId}/meetings/${meeting.id}`
      : '/projects'
  );
};
```

**MeetingDetail only fetches existing data:**
```typescript
// client/src/pages/MeetingDetail.tsx:79-98
useEffect(() => {
  const loadMeeting = async () => {
    const data = await getAPI().getMeeting({
      meeting_id: id,
      include_segments: true,
      include_summary: true, // Fetches existing, doesn't generate
    });
    setMeeting(data.meeting);
    setSegments(data.segments || []);
    setSummary(data.summary); // null if not generated - stays null
  };

  loadMeeting();
}, [id]);
```

**Auto-extract feature exists but is disabled:**
```typescript
// client/src/hooks/use-entity-extraction.ts:116-121
useEffect(() => {
  if (autoExtract && meetingId && meetingState === 'completed') {
    extract(false); // This would work!
  }
}, [autoExtract, meetingId, meetingState, extract]);

// client/src/pages/MeetingDetail.tsx:72-76
const { extract: extractEntities } = useEntityExtraction({
  meetingId: id,
  meetingTitle: meeting?.title,
  meetingState: meeting?.state,
  // autoExtract: true <-- MISSING - feature disabled
});
```
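
The effect's guard condition can be expressed as a pure predicate, which makes the one-line fix easy to unit-test in isolation. This mirrors the `useEffect` condition in the excerpt above; the standalone function name is an invention for this sketch:

```typescript
// Mirrors the guard in use-entity-extraction.ts: extraction should
// fire only when autoExtract is opted in, a meeting id exists, and
// the meeting has reached the 'completed' state.
function shouldAutoExtract(
  autoExtract: boolean,
  meetingId: string | undefined,
  meetingState: string | undefined,
): boolean {
  return autoExtract && !!meetingId && meetingState === 'completed';
}
```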

**Summary progress events emitted but ignored:**
```rust
// client/src-tauri/src/commands/summary.rs:96-134
tauri::async_runtime::spawn(async move {
    let mut interval = tokio::time::interval(Duration::from_secs(1));
    loop {
        interval.tick().await;
        let elapsed_s = start.elapsed().as_secs();

        // Emits event but no React component listens
        emit_summary_progress(app.clone(), meeting_id.clone(), elapsed_s);

        if elapsed_s >= 300 { break; }
    }
});
```

### Blockers

1. ~~**No orchestration hook** (`usePostProcessing`) to coordinate processing after recording~~ ✅ Created
2. ~~**Immediate navigation** in Recording.tsx prevents client-side orchestration~~ ✅ Flow acceptable
3. **Auto-extract disabled** - simple fix but not done
4. **No progress UI** - summary progress events ignored
5. ~~**No ProcessingStatus tracking** on Meeting entity~~ ✅ Component created
6. **Backend ProcessingStatus** - not in proto/entity (needed for persistent tracking)

### Required Changes

**Quick Wins (No new code):**
1. Enable `autoExtract: true` in the MeetingDetail.tsx entity extraction hook

**New Components:** ✅ COMPLETED
1. ~~Create `client/src/hooks/use-post-processing.ts`~~ ✅ Done (30 unit tests)
2. ~~Create `client/src/components/processing-status.tsx`~~ ✅ Done (22 unit tests)
3. ~~Create E2E tests~~ ✅ Done (`e2e/post-processing.spec.ts` - 14 tests)

**Page Wiring:** ✅ COMPLETED
1. ~~`client/src/pages/Recording.tsx`~~ ✅ Flow acceptable (navigates to detail)
2. ~~`client/src/pages/MeetingDetail.tsx`~~ ✅ Hook and component wired (lines 40, 85, 413)
3. ~~`client/src/hooks/use-diarization.ts`~~ ✅ Triggered via usePostProcessing

**Remaining Modifications:**
1. Enable `autoExtract: true` in MeetingDetail.tsx
2. Wire summary progress event listeners to `ProcessingStatus`
3. Backend: Add `ProcessingStatus` to proto and Meeting entity
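
For item 2, the display side of the listener can be kept as a pure formatter so it is testable without a Tauri runtime. The event payload shape and field names below are assumptions inferred from the Rust excerpt (`meeting_id`, `elapsed_s`); only the Rust command confirms what is actually emitted:

```typescript
// Hypothetical payload for the summary progress event emitted by
// client/src-tauri/src/commands/summary.rs (names are assumptions).
interface SummaryProgressEventSketch {
  meetingId: string;
  elapsedS: number;
}

// Pure formatter: turns a progress tick into the label a
// ProcessingStatus-style component would render.
function progressLabel(evt: SummaryProgressEventSketch): string {
  const mins = Math.floor(evt.elapsedS / 60);
  const secs = evt.elapsedS % 60;
  return `Summarizing: ${mins}:${String(secs).padStart(2, '0')} elapsed`;
}
```

The actual wiring would subscribe via Tauri's event API (e.g. `listen(...)` from `@tauri-apps/api/event`) and feed each payload through a formatter like this; the event name to subscribe to must be taken from the Rust `emit_summary_progress` implementation.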

**Backend (Optional - for extended status tracking):**
1. `src/noteflow/domain/entities/meeting.py` - Add `ProcessingStatus` dataclass
2. `src/noteflow/grpc/proto/noteflow.proto` - Add `ProcessingStatus` message

### Desktop Testing Required

**Yes** - The full recording → stop → processing → display flow must be tested on the desktop app to verify:
- Processing triggers after recording stops
- Progress events display correctly
- Failed processing steps don't block others
- Navigation and state transitions work correctly

---

## GAP-W01: OIDC Provider Management (Sprint 17)

### Description

The OIDC provider management feature is fully implemented in the Python backend and **95% wired in the client**. The previous audit was outdated - client files now exist.

### Current Status (Updated 2026-01-05)

**✅ Client wiring complete:**
- `client/src-tauri/src/grpc/client/oidc.rs` - ✅ EXISTS
- `client/src-tauri/src/commands/oidc.rs` - ✅ EXISTS
- `client/src/api/tauri-constants.ts` - ✅ 8 OIDC constants (lines 92-100)
- `client/src/api/interface.ts` - ✅ 8 OIDC methods (lines 658-712)
- `client/src/api/tauri-adapter.ts` - ✅ Full implementation (lines 902-951)
- `client/src/api/mock-adapter.ts` - ✅ Mock implementation (lines 1290+)
- `client/src/hooks/use-oidc-providers.ts` - ✅ EXISTS with tests

**⚠️ Remaining work:**
- `client/src/components/settings/integration-config-panel.tsx` - Uses **fake test** instead of real OIDC API

### User Flow Trace

```
User Action                      What Actually Happens
─────────────────────────────────────────────────────────────────────────
Settings → OIDC tab              Shows generic Integration panel
                                 (integration-config-panel.tsx:660-810)

Configure issuer_url, client_id  Updates local state only
                                 (onUpdate → preferences.updateIntegration)

Click "Test OIDC Connection"     FAKE TEST:
                                 1. Sleep 1.5 seconds (line 205)
                                 2. Check if fields filled
                                 3. Update local preferences
                                 4. Show "Connection test passed" toast

                                 NO gRPC call to backend
                                 NO actual OIDC discovery
                                 NO token validation

Save/Close                       Stored in localStorage only
                                 Backend has no knowledge of provider
```

### Code Evidence

**The "test" is completely fake:**
```typescript
// client/src/components/settings/integrations-section.tsx:203-226
const handleTestIntegration = async (integration: Integration) => {
  setTestingIntegration(integration.id);
  await new Promise((resolve) => setTimeout(resolve, 1500)); // Just sleeps!

  if (hasRequiredIntegrationFields(integration)) {
    toast({
      title: 'Connection test passed', // Lies to user
      description: `${integration.name} is configured correctly`,
    });
    preferences.updateIntegration(integration.id, { status: 'connected' });
  }
  // NO gRPC call anywhere
};
```
|
||||
|
||||
### Proto RPCs (7 endpoints - Backend Only)

```protobuf
// src/noteflow/grpc/proto/noteflow.proto:90-96
rpc RegisterOidcProvider(RegisterOidcProviderRequest) returns (OidcProviderProto);
rpc ListOidcProviders(ListOidcProvidersRequest) returns (ListOidcProvidersResponse);
rpc GetOidcProvider(GetOidcProviderRequest) returns (OidcProviderProto);
rpc UpdateOidcProvider(UpdateOidcProviderRequest) returns (OidcProviderProto);
rpc DeleteOidcProvider(DeleteOidcProviderRequest) returns (DeleteOidcProviderResponse);
rpc RefreshOidcDiscovery(RefreshOidcDiscoveryRequest) returns (RefreshOidcDiscoveryResponse);
rpc ListOidcPresets(ListOidcPresetsRequest) returns (ListOidcPresetsResponse);
```

### Implementation Status

| Layer | File | Status |
|-------|------|--------|
| **Python Backend** | `src/noteflow/grpc/_mixins/oidc.py` | ✅ Complete (7 RPCs) |
| **Python Tests** | `tests/grpc/test_oidc_mixin.py` | ✅ 27 tests passing |
| **Rust gRPC Client** | `client/src-tauri/src/grpc/client/oidc.rs` | ✅ Complete |
| **Rust Commands** | `client/src-tauri/src/commands/oidc.rs` | ✅ Complete |
| **TS Constants** | `client/src/api/tauri-constants.ts` | ✅ Lines 92-100 |
| **TS Interface** | `client/src/api/interface.ts` | ✅ Lines 658-712 |
| **TS Adapter** | `client/src/api/tauri-adapter.ts` | ✅ Lines 902-951 |
| **Mock Adapter** | `client/src/api/mock-adapter.ts` | ✅ Lines 1290+ |
| **TS Hooks** | `client/src/hooks/use-oidc-providers.ts` | ✅ Complete with tests |
| **UI Component** | `client/src/components/settings/integration-config-panel.tsx` | ⚠️ Uses fake test |

### Impact

- ~~**User deception**: UI shows "Connection test passed" when nothing was tested~~ Still true for integration panel
- ~~**No SSO functionality**: OIDC providers cannot actually authenticate users~~ API ready, UI not connected
- ~~**Sprint doc inaccuracy**: Claimed deliverables don't exist~~ ✅ Files now exist

### Blockers (All Resolved Except UI)

1. ~~**No Rust gRPC wrapper module** for OIDC operations~~ ✅ Done
2. ~~**No Tauri commands** to invoke OIDC RPCs~~ ✅ Done
3. ~~**No TypeScript adapter methods** to call Tauri commands~~ ✅ Done
4. **Generic Integration panel** still uses fake test instead of real OIDC API

### Required Changes

1. ~~Create `client/src-tauri/src/grpc/client/oidc.rs` wrapper~~ ✅ Done
2. ~~Create `client/src-tauri/src/commands/oidc.rs` with Tauri commands~~ ✅ Done
3. ~~Add OIDC constants to `client/src/api/tauri-constants.ts`~~ ✅ Done
4. ~~Add OIDC methods to `NoteFlowAPI` interface and `tauri-adapter.ts`~~ ✅ Done
5. ~~Create `client/src/hooks/use-oidc-providers.ts`~~ ✅ Done
6. **Update `integration-config-panel.tsx`** to call real OIDC API methods instead of the fake test

### Desktop Testing Required

**Yes** - OIDC involves OAuth redirects, token exchange, and discovery that must be tested end-to-end.

---

## GAP-W02: Workspace Management via gRPC (Informational)

### Description

The workspace management RPCs are implemented in Python. The Rust commands return local defaults instead of calling gRPC. **This is intentional local-first behavior.**

### User Flow Trace

```
User Action                 What Actually Happens
─────────────────────────────────────────────────────────────────────────
App starts                  workspace-context.tsx:71-102
                            ├─ Calls api.getCurrentUser()
                            ├─ Calls api.listWorkspaces()
                            └─ Falls back to local defaults if error

Click workspace dropdown    workspace-switcher.tsx calls switchWorkspace()

Select different workspace  workspace-context.tsx:108-133
                            ├─ Calls api.switchWorkspace(workspaceId)
                            ├─ Uses response.workspace if available
                            ├─ Falls back to local list match
                            └─ Persists to localStorage

Backend unavailable         ✅ Works - uses fallback workspace
Backend available           ✅ Works - uses backend response
```

### Why This Is Not A Bug

The `workspace-context.tsx` implementation (lines 71-102, 108-133) demonstrates **graceful degradation**:

```typescript
// workspace-context.tsx:81-82
const availableWorkspaces =
  workspaceResponse.workspaces.length > 0 ? workspaceResponse.workspaces : [fallbackWorkspace];
```

When the backend returns real workspaces, they're used. When unavailable, the app continues working with local defaults.

### Status: Design Decision Complete

| Behavior | Status |
|----------|--------|
| Local-first operation | ✅ Implemented |
| Backend integration | ⚠️ Optional - could add gRPC-first with fallback |
| User experience | ✅ Seamless - works offline and online |

### Recommendation

No action required unless multi-user workspace sync is needed. The current implementation correctly prioritizes local-first operation while still accepting backend responses when available.

### Desktop Testing Required

**No** - behavior is intentional and works correctly.

---

## GAP-W03: Preferences Sync Constants

### Description

The preferences sync commands exist in Rust but are not registered in `TauriConstants` or the `NoteFlowAPI` interface. They are called directly via `invoke()` in `preferences-sync.ts`.

### Tauri Commands

```rust
// client/src-tauri/src/commands/preferences.rs:40-66
#[tauri::command(rename_all = "snake_case")]
pub async fn get_preferences_sync(/* ... */) -> Result<PreferencesSyncResult> { /* ... */ }

#[tauri::command(rename_all = "snake_case")]
pub async fn set_preferences_sync(/* ... */) -> Result<SetPreferencesResult> { /* ... */ }
```

### Usage (Bypasses Adapter)

```typescript
// client/src/lib/preferences-sync.ts:148-151
const response = await invoke<GetPreferencesResult>('get_preferences_sync', {
  keys: keys ?? [],
});

// client/src/lib/preferences-sync.ts:200-205
const response = await invoke<SetPreferencesResult>('set_preferences_sync', {
  preferences: encoded,
  if_match: options?.force ? null : meta.etag,
  // ...
});
```

### Assessment

**Low severity** - Functionality works, but the pattern is inconsistent with the rest of the codebase.

### Recommendation

Add to `TauriConstants` for consistency:

```typescript
GET_PREFERENCES_SYNC: 'get_preferences_sync',
SET_PREFERENCES_SYNC: 'set_preferences_sync',
```

### Desktop Testing Required

Optional - basic functionality already works.

---

## ~~GAP-W04: OAuth Loopback Adapter Method~~ (Resolved)

### Status: **REMOVED** - Works correctly

### User Flow Trace

```
User Action                    What Actually Happens
─────────────────────────────────────────────────────────────────────────
Settings → Calendar → Connect  integrations-section.tsx:131
                               └─ Calls initiateAuth(provider)

initiateAuth runs              use-oauth-flow.ts:163-243
                               ├─ Checks isTauriEnvironment()
                               ├─ If Tauri: invoke('initiate_oauth_loopback')
                               │  (lines 189-193)
                               └─ If browser: window.open(authUrl)

Rust handles OAuth             calendar.rs:73-77
                               ├─ Starts local HTTP server
                               ├─ Opens browser with OAuth URL
                               ├─ Waits for callback
                               └─ Completes OAuth with server

Success                        use-oauth-flow.ts:197-208
                               ├─ Fetches connection status
                               └─ Updates state to 'connected'
```

### Why This Was Incorrectly Flagged

The original analysis only checked whether the method was in `tauri-adapter.ts`. It is not - but `use-oauth-flow.ts` **directly invokes** the Tauri command:

```typescript
// use-oauth-flow.ts:189-193
const result = await invoke<{
  success: boolean;
  integration_id: string | null;
  error_message: string | null;
}>('initiate_oauth_loopback', { provider });
```

This pattern (direct `invoke()` bypassing the adapter) is used intentionally for specialized flows. The feature **works correctly**.

### Lesson Learned

Checking whether a method exists in an adapter is insufficient; actual user flows must be traced to verify functionality.

### Test Suite Update (2026-01-05)

Fixed 4 failing tests in `use-oauth-flow.test.ts`:
- Hook now correctly passes `redirectUri` parameter to API
- Replaced deep-link tests with Tauri loopback tests (deep links aren't used with the loopback flow)
- All 22 oauth-flow tests now pass

---

## Verification Checklist

### For GAP-W05 (Post-Processing Orchestration) Implementation

**Quick Win:**
- [ ] Enable `autoExtract: true` in MeetingDetail.tsx

**Client Components:** ✅ COMPLETED
- [x] Create `client/src/hooks/use-post-processing.ts`
- [x] Create `client/src/components/processing-status.tsx`
- [ ] Add summary progress event listener
- [x] Add auto-diarization trigger to `use-diarization.ts` (via usePostProcessing)
- [x] Wire `usePostProcessing` into MeetingDetail.tsx (lines 40, 85)
- [x] Update Recording.tsx to handle processing transitions (flow acceptable)

**Backend (Required):** ✅ COMPLETED
- [x] Add `ProcessingStatus` dataclass to Meeting entity (lines 38-148)
- [x] Add `ProcessingStatus` message to proto (lines 595-628)
- [x] Regenerate proto stubs
- [x] Update GetMeeting to include processing status (line 272)
- [x] Add `ENTITIES_EXTRACTED` webhook event (domain/webhooks/events.py:54)
- [x] Add `DIARIZATION_COMPLETED` webhook event (domain/webhooks/events.py:55)

**Testing:** ✅ COMPLETED
- [x] Unit test: `usePostProcessing` hook state transitions (30 tests)
- [x] Unit test: `ProcessingStatus` component rendering (22 tests)
- [x] E2E test: Post-processing flow (14 tests)
- [ ] Desktop E2E: Recording → Processing → Complete flow

---

### For GAP-W01 (OIDC) Implementation

- [x] Create Rust gRPC client module (`grpc/client/oidc.rs`)
- [x] Add module to `grpc/client/mod.rs`
- [x] Create Rust command handlers (`commands/oidc.rs`)
- [x] Register commands in Tauri
- [x] Add types to `grpc/types/`
- [x] Add constants to `tauri-constants.ts`
- [x] Add methods to `interface.ts`
- [x] Implement in `tauri-adapter.ts`
- [x] Add mock implementation in `mock-adapter.ts`
- [ ] **Update `integration-config-panel.tsx` to use real OIDC API**
- [ ] Test OIDC flow on desktop app

### General Wiring Checklist

When adding a new gRPC RPC:

1. **Proto** → `noteflow.proto` + regenerate stubs
2. **Python** → Create/update mixin in `_mixins/`
3. **Rust gRPC Client** → Add method to appropriate module in `grpc/client/`
4. **Rust Types** → Add types to `grpc/types/`
5. **Rust Commands** → Add Tauri command in `commands/`
6. **TS Constants** → Add command name to `tauri-constants.ts`
7. **TS Types** → Add request/response types to `api/types/`
8. **TS Interface** → Add method signature to `interface.ts`
9. **TS Adapter** → Implement in `tauri-adapter.ts`
10. **Mock Adapter** → Add mock implementation for browser dev
11. **Cached Adapter** → Add offline implementation or explicitly reject
12. **Tests** → Add tests at each layer

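As a sketch of step 2 of the checklist above, a Python mixin typically holds one async method per proto RPC, validating the request before delegating to a service layer. All names below (`ListWidgetsRequest`, `WidgetsMixin`) are hypothetical stand-ins, not the NoteFlow API:

```python
from dataclasses import dataclass


@dataclass
class ListWidgetsRequest:
    # Hypothetical request type standing in for a generated proto message.
    workspace_id: str


@dataclass
class ListWidgetsResponse:
    # Hypothetical response type.
    names: list[str]


class WidgetsMixin:
    """Hypothetical mixin shape: one async method per proto rpc."""

    async def ListWidgets(
        self, request: ListWidgetsRequest, context: object
    ) -> ListWidgetsResponse:
        # Validate input first, then delegate to a service layer.
        if not request.workspace_id:
            raise ValueError("workspace_id is required")
        return ListWidgetsResponse(names=["a", "b"])
```

The real mixins abort via gRPC status helpers rather than raising `ValueError`; the shape (validate, then delegate, then map to proto) is the part that carries over.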
172
docs/sprints/phase-ongoing/sprint-26-error-hygiene/README.md
Normal file
@@ -0,0 +1,172 @@

# Sprint 26: Error + Logging Hygiene

> **Size**: M | **Owner**: Full-stack | **Prerequisites**: None
> **Phase**: Ongoing

---

## Objective

Reduce silent failures and improve observability by routing non-critical errors through centralized logging, removing duplicated storage/error-handling patterns, and hardening Tauri background runtimes against panic-only failures.

**Callout: No new compatibility layers or wrappers**
- Do not add new helper wrappers, shims, or compatibility modules.
- Prefer direct edits in existing files and remove duplicated patterns instead of abstracting them.
- Keep changes minimal, explicit, and localized.

---

## Findings (2026-01-09)

### High

1) **Tauri background runtime creation panics instead of reporting failure**
   - **Impact**: If the runtime fails to build, the thread panics and the polling/emitter silently stops, leaving triggers/events dead without a surfaced error.
   - **Excerpt**:
     ```rust
     let rt = tokio::runtime::Builder::new_current_thread()
         .enable_all()
         .build()
         .expect("Failed to create trigger polling runtime");
     ```
   - **Locations**:
     - `client/src-tauri/src/commands/triggers/polling.rs:108-113` (runtime build uses `expect`)
     - `client/src-tauri/src/events/mod.rs:297-302` (runtime build uses `expect`)
     - `client/src-tauri/src/lib.rs:330-340` (app run uses `expect`)

### Medium

2) **Silent/quiet failures in diarization compatibility patches**
   - **Impact**: Optional dependency import/patch failures are swallowed without any trace, making it hard to understand why diarization patches weren’t applied.
   - **Excerpt**:
     ```python
     def _load_pyannote_task_module() -> ModuleType | None:
         try:
             return importlib.import_module("pyannote.audio.core.task")
         except Exception:
             return None
     ```
   - **Locations**:
     - `src/noteflow/infrastructure/diarization/_compat.py:63-67` (pyannote task import fails silently)
     - `src/noteflow/infrastructure/diarization/_compat.py:128-132` (huggingface file_download import fails silently)
     - `src/noteflow/infrastructure/diarization/_compat.py:146-155` (torchaudio import failure ignored)

3) **Client catch blocks swallow errors with no logging or telemetry**
   - **Impact**: Failure to load Tauri paths, secure storage, or secrets is invisible to log buffers and diagnostics.
   - **Excerpt**:
     ```tsx
     try {
       await setSecureValue(`${configType}_api_key`, key);
     } catch {
       // Error swallowed intentionally - secure storage write failure is non-critical
     }
     ```
   - **Locations**:
     - `client/src/components/settings/export-ai-section.tsx:185-198`
     - `client/src/components/settings/ai-config-section.tsx:142-147`
     - `client/src/hooks/use-secure-integration-secrets.ts:37-46`

4) **Non-centralized client logging (console.warn/error bypasses log buffer)**
   - **Impact**: Important operational warnings don’t appear in `client-logs` or the diagnostics UI.
   - **Excerpt**:
     ```ts
     console.warn('[Reconnection] Integration revalidation failed:', error);
     ```
   - **Locations**:
     - `client/src/api/reconnection.ts:54-79` (reconnect diagnostics)
     - `client/src/hooks/use-panel-preferences.ts:105-128` (storage issues)
     - `client/src/hooks/use-meeting-reminders.ts:41-96` (storage issues)

### Low

5) **Duplicated localStorage wrappers despite shared utilities**
   - **Impact**: Multiple ad-hoc localStorage read/write paths duplicate validation/logging patterns; increases drift risk and bypasses `lib/storage-utils.ts`.
   - **Excerpt**:
     ```ts
     try {
       localStorage.setItem(STORAGE_KEY, JSON.stringify(settings));
     } catch (error) {
       console.warn('[MeetingReminders] saveSettings failed - settings won\'t persist:', error);
     }
     ```
   - **Locations**:
     - `client/src/hooks/use-panel-preferences.ts:103-128`
     - `client/src/hooks/use-meeting-reminders.ts:41-96`
     - `client/src/lib/storage-utils.ts:11-72` (centralized wrapper exists but is not used in these hooks)

---

## Scope

| Task | Effort | Owner | Notes |
|------|--------|-------|-------|
| Replace `expect` in Tauri background runtime creation with error reporting + graceful disable | M | Rust | Log via `tracing::error`, optionally emit UI error event |
| Add debug-level logging for optional diarization patch failures | S | Backend | Log once per session to avoid noise |
| Convert silent `catch {}` blocks to `catch (error)` + `addClientLog` or `toastError` as appropriate | M | Client | Keep failures non-blocking but visible, no new wrapper helper |
| Centralize localStorage access in hooks via `lib/storage-utils.ts` | S | Client | Align with code-quality expectations |

---

## Implementation Plan

### Task 1: Harden Tauri background runtimes
- Replace `.expect(...)` with explicit error handling and `tracing::error!` logging.
- If the runtime cannot be created, disable the loop and emit a UI error event via the existing `AppEvent::Error` (no new wrappers).
- Ensure any early return logs include enough context to trace the failed subsystem.

### Task 2: Make diarization compatibility failures visible
- Add `logger.debug` or `logger.info` on optional import failures (`pyannote`, `huggingface_hub`, `torchaudio`).
- Use a once-only guard to avoid repeated logging for missing optional dependencies.
- Do not create a new compatibility module; only edit the existing `_compat.py`.

### Task 3: Centralize frontend error capture
- Replace console warnings in reconnection and storage flows with direct `addClientLog` usage.
- Prefer `toastError` only for user-facing flows (settings panels, destructive actions).
- Do not add a new logging helper wrapper.

### Task 4: Deduplicate localStorage access
- Convert `use-panel-preferences` and `use-meeting-reminders` to use `readJson`, `writeJson` from `lib/storage-utils.ts`.
- Keep validation logic in the hook; only storage access moves to centralized utilities.
- Ensure no new localStorage wrappers remain outside the allowed files list.

---

## Execution Guardrails

**No duplicates**
- If a pattern already exists (e.g., `addClientLog`), use it directly instead of creating a parallel helper.
- Remove the old ad-hoc path once the centralized path is in place.

**Clean code**
- Avoid nested try/catch blocks.
- Prefer explicit error messages with consistent metadata keys.
- Do not add new compatibility layers or wrapper utilities.

**Quality gates (must pass)**
- `make quality`
- `pytest tests/quality/` (required after non-trivial changes)

---

## Acceptance Criteria

- No `expect(...)` remains in non-test Tauri runtime setup paths; failures are logged and surfaced.
- Optional dependency patch failures in diarization emit a traceable log entry.
- All non-critical client warnings/errors are captured in client logs instead of `console.warn/error`.
- Hooks no longer implement their own localStorage read/write try/catch blocks.
- No new compatibility layers or wrapper helpers were added.
- All quality checks pass (`make quality`, `pytest tests/quality/`).

---

## Out of Scope

- Refactoring backend error taxonomy or gRPC error mappings.
- Replacing intentional fire-and-forget error suppression where logging already exists.

---

## Risks / Notes

- Some silent failures are intentionally non-fatal; the goal is visibility, not strict error propagation.
- Client-side logging should avoid leaking secrets (sanitize error messages before logging).

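Sanitizing before logging can be as simple as masking secret-bearing `key=value` pairs. A minimal sketch under assumed key names (`api_key`, `token`, `secret` are illustrative; real code would enumerate the secrets it actually handles):

```python
import re

# Hypothetical secret key names; extend to match the secrets the app actually handles.
_SECRET_PATTERN = re.compile(r"(api_key|token|secret)=([^\s&]+)", re.IGNORECASE)


def sanitize_error_message(message: str) -> str:
    """Mask secret-bearing key=value pairs before the message reaches client logs."""
    return _SECRET_PATTERN.sub(lambda m: f"{m.group(1)}=***", message)
```

The same pattern-based masking translates directly to the TypeScript client log path.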
@@ -59,14 +59,14 @@ class OidcMixin:
             preset = parse_preset(request.preset) if request.preset else OidcProviderPreset.CUSTOM
         except ValueError:
             await abort_invalid_argument(context, ERR_INVALID_PRESET)
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         # Parse workspace ID
         try:
             workspace_id = UUID(request.workspace_id) if request.workspace_id else UUID(int=0)
         except ValueError:
             await abort_invalid_argument(context, ERROR_INVALID_WORKSPACE_ID_FORMAT)
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         custom_config = parse_register_options(request)

@@ -129,14 +129,14 @@ class OidcMixin:
             provider_id = parse_provider_id(request.provider_id)
         except ValueError:
             await abort_invalid_argument(context, ERR_INVALID_PROVIDER_ID)
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         oidc_service = self.get_oidc_service()
         provider = oidc_service.registry.get_provider(provider_id)

         if provider is None:
             await abort_not_found(context, ENTITY_OIDC_PROVIDER, str(provider_id))
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         return oidc_provider_to_proto(provider)

@@ -152,14 +152,14 @@ class OidcMixin:
             provider_id = parse_provider_id(request.provider_id)
         except ValueError:
             await abort_invalid_argument(context, ERR_INVALID_PROVIDER_ID)
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         oidc_service = self.get_oidc_service()
         provider = oidc_service.registry.get_provider(provider_id)

         if provider is None:
             await abort_not_found(context, ENTITY_OIDC_PROVIDER, str(provider_id))
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         apply_update_request_to_provider(provider, request)
         return oidc_provider_to_proto(provider)
@@ -176,14 +176,14 @@ class OidcMixin:
             provider_id = parse_provider_id(request.provider_id)
         except ValueError:
             await abort_invalid_argument(context, ERR_INVALID_PROVIDER_ID)
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         oidc_service = self.get_oidc_service()
         success = oidc_service.registry.remove_provider(provider_id)

         if not success:
             await abort_not_found(context, ENTITY_OIDC_PROVIDER, str(provider_id))
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         return noteflow_pb2.DeleteOidcProviderResponse(success=success)

@@ -201,7 +201,7 @@ class OidcMixin:
             provider_id = parse_provider_id(request.provider_id)
         except ValueError:
             await abort_invalid_argument(context, ERR_INVALID_PROVIDER_ID)
-            raise AssertionError("unreachable")  # abort is NoReturn
+            raise AssertionError("unreachable") from None  # abort is NoReturn

         return await refresh_single_provider(oidc_service, provider_id, context)

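The hunks above append `from None` to the unreachable raises that follow gRPC abort helpers. The point of `from None` in this pattern: the raise sits inside an `except` block, so without it Python chains the pending exception into the traceback ("During handling of the above exception..."). A minimal sketch with hypothetical helper functions showing the attribute-level difference:

```python
def raise_plain() -> AssertionError:
    """Raise inside an except block without `from None`; the context is chained."""
    try:
        int("x")  # raises ValueError
    except ValueError:
        try:
            raise AssertionError("unreachable")
        except AssertionError as exc:
            return exc


def raise_from_none() -> AssertionError:
    """Same raise with `from None`; the pending ValueError is suppressed in tracebacks."""
    try:
        int("x")  # raises ValueError
    except ValueError:
        try:
            raise AssertionError("unreachable") from None
        except AssertionError as exc:
            return exc
```

`from None` sets `__cause__` to `None` and `__suppress_context__` to `True`; `__context__` is still recorded but no longer printed, which keeps the intentional abort out of the noise.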
@@ -3,6 +3,7 @@
 from __future__ import annotations

 from collections.abc import AsyncIterator
+from dataclasses import dataclass
 from typing import TYPE_CHECKING, Protocol, cast

 import numpy as np
@@ -57,9 +58,6 @@ class _AsrResultLike(Protocol):
     def end(self) -> float: ...


-from dataclasses import dataclass
-
-
 @dataclass(frozen=True, slots=True)
 class _SegmentBuildContext:
     """Context for building segments from ASR results.

@@ -56,12 +56,12 @@ async def parse_template_id(
     """Parse and validate template ID."""
     if not template_id:
         await abort_invalid_argument(context, f"{field_name}{ERROR_REQUIRED_SUFFIX}")
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None
     try:
         return UUID(template_id)
     except ValueError:
         await abort_invalid_argument(context, f"{ERROR_INVALID_PREFIX}{field_name}")
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None


 async def list_summarization_templates(
@@ -74,12 +74,12 @@ async def list_summarization_templates(
     context = ctx.context
     if not request.workspace_id:
         await abort_invalid_argument(context, ERROR_WORKSPACE_ID_REQUIRED)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     workspace_id = await parse_workspace_id(request.workspace_id, context)
     if workspace_id != op_context.workspace_id:
         await abort_permission_denied(context, ERROR_WORKSPACE_SCOPE_MISMATCH)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     limit = request.limit if request.limit > 0 else 50
     offset = max(request.offset, 0)
@@ -113,10 +113,10 @@ async def get_summarization_template(
     template = await ctx.template_service.get_template(repo, template_id)
     if template is None:
         await abort_not_found(context, ENTITY_SUMMARIZATION_TEMPLATE, request.template_id)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None
     if template.workspace_id is not None and template.workspace_id != op_context.workspace_id:
         await abort_permission_denied(context, ERROR_TEMPLATE_NOT_AVAILABLE)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     if request.include_current_version and template.current_version_id is not None:
         version = await ctx.template_service.get_current_version(repo, template)
@@ -140,7 +140,7 @@ async def create_summarization_template(
     context = ctx.context
     if not request.workspace_id:
         await abort_invalid_argument(context, ERROR_WORKSPACE_ID_REQUIRED)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     workspace_id = await parse_workspace_id(request.workspace_id, context)
     description = request.description if request.HasField("description") else None
@@ -157,10 +157,10 @@ async def create_summarization_template(
         result = await ctx.template_service.create_template(repo, op_context, payload)
     except PermissionError as exc:
         await abort_permission_denied(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None
     except ValueError as exc:
         await abort_invalid_argument(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     if result.version is not None:
         return noteflow_pb2.SummarizationTemplateMutationResponse(
@@ -198,14 +198,14 @@ async def update_summarization_template(
         result = await ctx.template_service.update_template(repo, op_context, payload)
     except PermissionError as exc:
         await abort_permission_denied(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None
     except ValueError as exc:
         await abort_invalid_argument(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     if result is None:
         await abort_not_found(context, ENTITY_SUMMARIZATION_TEMPLATE, request.template_id)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     if result.version is not None:
         return noteflow_pb2.SummarizationTemplateMutationResponse(
@@ -235,16 +235,16 @@ async def archive_summarization_template(
         )
     except PermissionError as exc:
         await abort_permission_denied(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     if not archived:
         await abort_not_found(context, ENTITY_SUMMARIZATION_TEMPLATE, request.template_id)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     template = await repo.summarization_templates.get(template_id)
     if template is None:
         await abort_not_found(context, ENTITY_SUMMARIZATION_TEMPLATE, request.template_id)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     return summarization_template_to_proto(template)

@@ -265,10 +265,10 @@ async def list_summarization_template_versions(
     template = await ctx.template_service.get_template(repo, template_id)
     if template is None:
         await abort_not_found(context, ENTITY_SUMMARIZATION_TEMPLATE, request.template_id)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None
     if template.workspace_id is not None and template.workspace_id != op_context.workspace_id:
         await abort_permission_denied(context, ERROR_TEMPLATE_NOT_AVAILABLE)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     versions = await ctx.template_service.list_versions(repo, template_id)
     total_count = len(versions)
@@ -304,13 +304,13 @@ async def restore_summarization_template_version(
         )
     except PermissionError as exc:
         await abort_permission_denied(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None
     except ValueError as exc:
         await abort_invalid_argument(context, str(exc))
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     if template is None:
         await abort_not_found(context, ENTITY_SUMMARIZATION_TEMPLATE, request.template_id)
-        raise RuntimeError(UNREACHABLE_ERROR)
+        raise RuntimeError(UNREACHABLE_ERROR) from None

     return summarization_template_to_proto(template)

@@ -120,7 +120,7 @@ class TemplatePromptResolver:
     async def _parse_requested_template_id(self, template_id: str) -> UUID:
         if not self._repo.supports_workspaces:
             await require_feature_workspaces(self._repo, self._context)
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None
         try:
             return UUID(template_id)
         except ValueError:
@@ -128,7 +128,7 @@ class TemplatePromptResolver:
                 self._context,
                 f"{ERROR_INVALID_PREFIX}{RULE_FIELD_TEMPLATE_ID}",
             )
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None

     async def _resolve_default_template_id(self) -> UUID | None:
         if not self._repo.supports_workspaces:
@@ -151,7 +151,7 @@ class TemplatePromptResolver:
                 self._context,
                 "Workspace default summarization template is invalid",
             )
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None

     async def _load_project_workspace(self) -> tuple[Project | None, Workspace | None]:
         if self._project is not None or self._workspace is not None:
@@ -192,16 +192,16 @@ class TemplatePromptResolver:
                 ENTITY_SUMMARIZATION_TEMPLATE,
                 str(template_id),
             )
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None
         if template.is_archived:
             await abort_failed_precondition(self._context, "Summarization template is archived")
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None
         if template.workspace_id is not None and template.workspace_id != self._op_context.workspace_id:
             await abort_permission_denied(self._context, ERROR_TEMPLATE_NOT_AVAILABLE)
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None
         if template.current_version_id is None:
             await abort_failed_precondition(self._context, "Template has no current version")
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None

         version = await self._repo.summarization_templates.get_version(
             template.current_version_id
@@ -212,7 +212,7 @@ class TemplatePromptResolver:
                 "SummarizationTemplateVersion",
                 str(template.current_version_id),
             )
-            raise RuntimeError(UNREACHABLE_ERROR)
+            raise RuntimeError(UNREACHABLE_ERROR) from None
         return template, version

     async def _build_context_payload(self) -> TemplateContext:

@@ -9,6 +9,7 @@ import asyncio
 from collections.abc import Iterable, Iterator
 from typing import TYPE_CHECKING, Final, Protocol, TypedDict, Unpack, cast

+from noteflow.infrastructure.asr.dto import AsrResult, WordTiming
 from noteflow.infrastructure.logging import get_logger, log_timing

 if TYPE_CHECKING:
@@ -53,7 +54,6 @@ class _WhisperModel(Protocol):
         **kwargs: Unpack[_WhisperTranscribeKwargs],
     ) -> tuple[Iterable[_WhisperSegment], _WhisperInfo]: ...

-from noteflow.infrastructure.asr.dto import AsrResult, WordTiming

 logger = get_logger(__name__)

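The hunk pair above moves the `AsrResult` / `WordTiming` import from below the Protocol definitions into the top-level import block. As a general sketch of the split between runtime imports and type-checking-only imports (the function and names here are illustrative, not from the codebase):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Names used only in annotations can live here; they are never
    # imported at runtime, which avoids import cycles and startup cost.
    from collections.abc import Iterable

def first_upper(items: Iterable[str]) -> str:
    # With `from __future__ import annotations`, the hint above is not
    # evaluated at runtime, so the TYPE_CHECKING import suffices. Values
    # constructed at runtime (like AsrResult in the diff) must instead be
    # imported at module top level.
    return next(iter(items)).upper()

print(first_upper(["hello", "world"]))  # prints HELLO
```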
@@ -103,8 +103,9 @@ def extract_meeting_url(item: OutlookEvent) -> str | None:
     if online_url := item.get("onlineMeetingUrl"):
         return online_url

-    if online_meeting := item.get("onlineMeeting"):
-        if join_url := online_meeting.get("joinUrl"):
-            return join_url
+    if (online_meeting := item.get("onlineMeeting")) and (
+        join_url := online_meeting.get("joinUrl")
+    ):
+        return join_url

     return None

@@ -11,6 +11,7 @@ from dataclasses import dataclass, field
 from typing import TYPE_CHECKING, Protocol

 import numpy as np
+from numpy.typing import NDArray

 from noteflow.config.constants import DEFAULT_SAMPLE_RATE
 from noteflow.infrastructure.diarization.dto import SpeakerTurn
@@ -32,7 +33,6 @@ class _Annotation(Protocol):
         yield_label: bool,
     ) -> Sequence[tuple[_TrackSegment, object, object]]: ...

-from numpy.typing import NDArray

 logger = get_logger(__name__)

@@ -10,6 +10,8 @@ from collections.abc import Sequence
 from typing import TYPE_CHECKING, Unpack
 from uuid import UUID

+from noteflow.domain.ports.repositories.identity._workspace import WorkspaceCreateKwargs
+
 _ERR_USERS_DB = "Users require database persistence"
 _ERR_WORKSPACES_DB = "Workspaces require database persistence"

@@ -20,7 +22,6 @@ if TYPE_CHECKING:
         WorkspaceMembership,
         WorkspaceRole,
     )
-    from noteflow.domain.ports.repositories.identity._workspace import WorkspaceCreateKwargs


 class UnsupportedUserRepository:

@@ -2,7 +2,7 @@

 from __future__ import annotations

-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID
 from uuid import uuid4

@@ -29,7 +29,7 @@ class AnnotationModel(CreatedAtMixin, Base):
     """

     __tablename__ = "annotations"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
     annotation_id: Mapped[PyUUID] = mapped_column(

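The ORM hunks in this commit annotate `__table_args__` as `ClassVar[...]`, telling the type checker (basedpyright, per the lint artifact above) that the attribute is class-level metadata rather than a per-instance field. A SQLAlchemy-free sketch of what `ClassVar` communicates, using a stdlib dataclass as an illustrative stand-in:

```python
from dataclasses import dataclass, fields
from typing import ClassVar

@dataclass
class WithClassVar:
    # ClassVar pseudo-fields are excluded from the generated __init__ and
    # are exempt from the mutable-default restriction, matching the intent
    # of class-level table metadata.
    table_args: ClassVar[dict[str, str]] = {"schema": "noteflow"}
    name: str

# Only `name` is a real dataclass field; `table_args` stays on the class.
assert [f.name for f in fields(WithClassVar)] == ["name"]
assert WithClassVar("meetings").table_args == {"schema": "noteflow"}
```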
@@ -3,7 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import DateTime, Float, Integer, String, Text
@@ -32,7 +32,7 @@ class DiarizationJobModel(CreatedAtMixin, UpdatedAtMixin, Base):
     """

     __tablename__ = TABLE_DIARIZATION_JOBS
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     id: Mapped[str] = mapped_column(String(36), primary_key=True)
     meeting_id: Mapped[PyUUID] = meeting_id_fk_column(index=True)
@@ -68,7 +68,7 @@ class StreamingDiarizationTurnModel(CreatedAtMixin, Base):
     """

     __tablename__ = "streaming_diarization_turns"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
     meeting_id: Mapped[PyUUID] = meeting_id_fk_column(index=True)

@@ -3,7 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from pgvector.sqlalchemy import Vector
@@ -75,7 +75,7 @@ class MeetingModel(UuidPrimaryKeyMixin, CreatedAtMixin, MetadataMixin, Base):
     """Represent a meeting recording session."""

     __tablename__ = "meetings"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     # Forward-looking tenancy fields with safe defaults for current single-user mode
     workspace_id: Mapped[PyUUID] = mapped_column(
@@ -195,7 +195,7 @@ class SegmentModel(Base):
     """Represent a transcript segment within a meeting."""

     __tablename__ = "segments"
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint("meeting_id", "segment_id", name="segments_unique_per_meeting"),
         {"schema": "noteflow"},
     )
@@ -235,7 +235,7 @@ class WordTimingModel(Base):
     """Represent word-level timing within a segment."""

     __tablename__ = "word_timings"
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint("segment_pk", "word_index", name="word_timings_unique_per_segment"),
         {"schema": "noteflow"},
     )

@@ -3,7 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import DateTime, Float, ForeignKey, Integer, Text, UniqueConstraint
@@ -36,7 +36,7 @@ class SummaryModel(Base):
     """Represent an LLM-generated meeting summary."""

     __tablename__ = "summaries"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     id: Mapped[int] = mapped_column(Integer, primary_key=True, autoincrement=True)
     meeting_id: Mapped[PyUUID] = meeting_id_fk_column(unique=True)
@@ -73,7 +73,7 @@ class KeyPointModel(Base):
     """Represent an extracted key point from a summary."""

     __tablename__ = TABLE_KEY_POINTS
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint("summary_id", "position", name="key_points_unique_position"),
         {"schema": "noteflow"},
     )
@@ -105,7 +105,7 @@ class ActionItemModel(Base):
     """Represent an extracted action item from a summary."""

     __tablename__ = TABLE_ACTION_ITEMS
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint("summary_id", "position", name="action_items_unique_position"),
         {"schema": "noteflow"},
     )

@@ -2,7 +2,7 @@

 from __future__ import annotations

-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import ForeignKey, String, Text, UniqueConstraint
@@ -41,7 +41,7 @@ class PersonModel(UuidPrimaryKeyMixin, CreatedAtMixin, UpdatedAtMixin, MetadataM
     """

     __tablename__ = "persons"
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint("workspace_id", EMAIL, name="persons_unique_email_per_workspace"),
         {"schema": "noteflow"},
     )
@@ -69,7 +69,7 @@ class MeetingSpeakerModel(CreatedAtMixin, Base):
     """Map speaker labels to display names and persons within a meeting."""

     __tablename__ = "meeting_speakers"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     meeting_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),

@@ -3,7 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import DateTime, ForeignKey, String, Text, UniqueConstraint
@@ -43,7 +43,7 @@ class WorkspaceModel(CreatedAtMixin, UpdatedAtMixin, MetadataMixin, Base):
     """Represent a workspace for multi-tenant support."""

     __tablename__ = "workspaces"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     id: Mapped[PyUUID] = mapped_column(UUID(as_uuid=True), primary_key=True)
     slug: Mapped[str | None] = mapped_column(Text, unique=True, nullable=True)
@@ -95,7 +95,7 @@ class UserModel(CreatedAtMixin, UpdatedAtMixin, MetadataMixin, Base):
     """Represent a user account."""

     __tablename__ = "users"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     id: Mapped[PyUUID] = mapped_column(UUID(as_uuid=True), primary_key=True)
     email: Mapped[str | None] = mapped_column(Text, unique=True, nullable=True)
@@ -122,7 +122,7 @@ class WorkspaceMembershipModel(CreatedAtMixin, Base):
     """Represent workspace membership with role."""

     __tablename__ = "workspace_memberships"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     workspace_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),
@@ -150,7 +150,7 @@ class ProjectModel(CreatedAtMixin, UpdatedAtMixin, MetadataMixin, Base):
     """Represent a project within a workspace."""

     __tablename__ = "projects"
-    __table_args__: tuple[object, ...] = (
+    __table_args__: ClassVar[tuple[object, ...]] = (
         # Unique slug per workspace
         UniqueConstraint("workspace_id", "slug", name="uq_projects_workspace_slug"),
         {"schema": "noteflow"},
@@ -187,7 +187,7 @@ class ProjectMembershipModel(Base):
     """Represent project membership with role."""

     __tablename__ = "project_memberships"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     project_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),

@@ -2,7 +2,7 @@

 from __future__ import annotations

-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import CheckConstraint, ForeignKey, String, Text, UniqueConstraint
@@ -30,7 +30,7 @@ class SettingsModel(UuidPrimaryKeyMixin, CreatedAtMixin, UpdatedAtMixin, Base):
     """Represent scoped settings (system, workspace, or user level)."""

     __tablename__ = "settings"
-    __table_args__: tuple[UniqueConstraint, CheckConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, CheckConstraint, dict[str, str]]] = (
         UniqueConstraint(
             "scope",
             "workspace_id",
@@ -72,7 +72,7 @@ class UserPreferencesModel(UpdatedAtMixin, Base):
     """

     __tablename__ = TABLE_USER_PREFERENCES
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     # Using key as primary key (matching schema.sql design for KV store simplicity)
     key: Mapped[str] = mapped_column(String((2 ** 3) * (2 ** 3)), primary_key=True)

@@ -3,7 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import (
@@ -47,7 +47,7 @@ class IntegrationModel(UuidPrimaryKeyMixin, CreatedAtMixin, UpdatedAtMixin, Base
     """Represent an external service integration."""

     __tablename__ = "integrations"
-    __table_args__: tuple[CheckConstraint, CheckConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[CheckConstraint, CheckConstraint, dict[str, str]]] = (
         CheckConstraint(
             "type IN ('auth', 'email', 'calendar', 'pkm', 'custom')",
             name="integrations_type_chk",
@@ -98,7 +98,7 @@ class IntegrationSecretModel(CreatedAtMixin, UpdatedAtMixin, Base):
     """Store encrypted secrets for an integration."""

     __tablename__ = "integration_secrets"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     integration_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),
@@ -119,7 +119,7 @@ class IntegrationSyncRunModel(UuidPrimaryKeyMixin, Base):
     """Track sync operation history for an integration."""

     __tablename__ = "integration_sync_runs"
-    __table_args__: tuple[CheckConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[CheckConstraint, dict[str, str]]] = (
         CheckConstraint(
             "status IN ('running', 'success', 'error')",
             name="integration_sync_runs_status_chk",
@@ -154,7 +154,7 @@ class CalendarEventModel(UuidPrimaryKeyMixin, CreatedAtMixin, UpdatedAtMixin, Ba
     """Cache calendar event data from an integration."""

     __tablename__ = "calendar_events"
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint(
             "integration_id",
             "external_id",
@@ -203,7 +203,7 @@ class MeetingCalendarLinkModel(Base):
     """Junction table linking meetings to calendar events."""

     __tablename__ = "meeting_calendar_links"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     meeting_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),
@@ -231,7 +231,7 @@ class ExternalRefModel(UuidPrimaryKeyMixin, CreatedAtMixin, Base):
     """Track references to external entities (generic ID mapping)."""

     __tablename__ = "external_refs"
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint(
             "integration_id",
             "entity_type",

@@ -3,7 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import Boolean, ForeignKey, Integer, String, Text
@@ -41,7 +41,7 @@ class WebhookConfigModel(UuidPrimaryKeyMixin, CreatedAtMixin, UpdatedAtMixin, Ba
     """

     __tablename__ = "webhook_configs"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     workspace_id: Mapped[PyUUID] = workspace_id_fk_column()
     name: Mapped[str] = mapped_column(String(255), nullable=False, default=WEBHOOK)
@@ -80,7 +80,7 @@ class WebhookDeliveryModel(UuidPrimaryKeyMixin, Base):
     """

     __tablename__ = "webhook_deliveries"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     webhook_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),

@@ -3,6 +3,7 @@
 from __future__ import annotations

 from datetime import datetime
+from typing import ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import Boolean, DateTime, Float, Integer, String
@@ -24,7 +25,7 @@ class UsageEventModel(UuidPrimaryKeyMixin, CreatedAtMixin, Base):
     """

     __tablename__ = "usage_events"
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     event_type: Mapped[str] = mapped_column(
         String(100),

@@ -2,7 +2,7 @@

 from __future__ import annotations

-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, ClassVar
 from uuid import UUID as PyUUID

 from sqlalchemy import ForeignKey, Text, UniqueConstraint
@@ -36,7 +36,7 @@ class TagModel(UuidPrimaryKeyMixin, CreatedAtMixin, Base):
     """Represent a tag that can be applied to meetings."""

     __tablename__ = "tags"
-    __table_args__: tuple[UniqueConstraint, dict[str, str]] = (
+    __table_args__: ClassVar[tuple[UniqueConstraint, dict[str, str]]] = (
         UniqueConstraint("workspace_id", "name", name="tags_unique_name_per_workspace"),
         {"schema": "noteflow"},
     )
@@ -61,7 +61,7 @@ class MeetingTagModel(Base):
     """Junction table linking meetings to tags."""

     __tablename__ = TABLE_MEETING_TAGS
-    __table_args__: dict[str, str] = {"schema": "noteflow"}
+    __table_args__: ClassVar[dict[str, str]] = {"schema": "noteflow"}

     meeting_id: Mapped[PyUUID] = mapped_column(
         UUID(as_uuid=True),