docs: replace lib-focused code review with comprehensive frontend architecture review covering React components, pages, and e2e tests
- Replaced codefixes.md with a broader frontend review covering React/TypeScript patterns, Tauri integration, component architecture, and test coverage
- Added pages.md documenting critical bugs (React key collision in the Analytics pie chart, fire-and-forget error swallowing), API architecture patterns, and code duplication opportunities
- Identified must-fix issues: Speaker
@@ -1,86 +1,157 @@
Here is a code review of the provided `client/src/lib` directory.

Here is a comprehensive code review of the provided React/TypeScript codebase.

## Executive Summary

### **Executive Summary**

The codebase demonstrates **high maturity** and a strong focus on reliability. There is a clear emphasis on handling "real world" edge cases that often plague desktop/web hybrid apps, such as audio device ID shifting, offline state synchronization, and backward compatibility for encryption keys.

The codebase represents a high-quality, modern React application likely built for a Tauri environment (desktop app). It demonstrates a strong grasp of component composition, state management, and UI consistency using Shadcn/Radix primitives.

The modularization of the `preferences` system and the robust testing culture (evident in the companion `.test.ts` files for almost every module) are stand-out features.

**Strengths:**

* **Architecture:** Clear separation of concerns between UI components, business logic hooks, and API layers.
* **UX/UI:** Sophisticated handling of loading states, error boundaries, and empty states. Excellent use of Framer Motion for polish.
* **Tauri Integration:** Robust handling of IPC events, offline states, and secure storage.
* **Testing:** Comprehensive component testing using Vitest and React Testing Library with proper mocking.
## Strengths

**Areas for Improvement:**

* Performance optimization for large lists (Logs/Entities).
* Reduction of prop drilling in complex settings panels.
* Standardization of "Magic Numbers" into constants.
* Browser compatibility fallbacks for specific APIs.

1. **Robust Audio Device Handling (`audio-device-ids.ts`)**
   * The logic to resolve audio devices when OS-level IDs change (e.g., "Wave Link (2)" becoming "Wave Link (3)") is excellent.
   * The test suite in `audio-device-persistence.integration.test.ts` is rigorous and covers specific, painful hardware scenarios.

---

2. **Centralized Configuration (`config/`)**
   * Moving all magic numbers, timeouts, and API endpoints into `app-config.ts` and `timing-constants.ts` makes the application highly tunable and maintainable.

### **1. Critical & Functional Issues**

3. **Resilient Async Operations (`async-utils.ts`)**
   * The `AsyncQueue` and `StreamingQueue` implementations provide necessary backpressure handling for audio streaming and API calls, preventing memory leaks during network congestion.
   * The `fireAndForget` wrapper is a safe pattern for non-critical side effects (logging, analytics) that shouldn't crash the app.

#### **A. `crypto.randomUUID` Dependency**

**File:** `client/src/components/timestamped-notes-editor.tsx` (Line 73)

The code uses `crypto.randomUUID()`. While supported in modern browsers and Tauri, it **throws an error** in non-secure contexts (plain `http://` origins other than localhost) or older environments.

* **Recommendation:** Use a library like `uuid` or a utility function wrapper that falls back to `Math.random` if `crypto` is unavailable, to prevent the app from crashing in edge-case network environments.
4. **Sophisticated Logging System**
   * The logging subsystem (`client-logs.ts`, `log-summarizer.ts`) is surprisingly advanced. It doesn't just dump text; it structures, groups (by meeting/operation), and summarizes repeated events (e.g., "Processed 45 segments"). This significantly aids debugging in production.

5. **Secure Storage Strategy (`crypto.ts`)**
   * Using `Web Crypto API` with `PBKDF2` key derivation is the correct standard.
   * The inclusion of `migrateSecureStorage` shows foresight regarding updating encryption strategies without logging users out.

## Areas for Improvement & Risks

### 1. LocalStorage Performance Bottlenecks

In several modules (`meeting-cache.ts`, `client-logs.ts`), data is serialized via `JSON.stringify` and written to `localStorage` synchronously.

* **Risk:** `client-logs.ts` caps at 500 entries. If logs are large (stack traces), stringifying the entire array on every log insertion will cause frame drops (UI jank).
* **Recommendation:** Move heavy logging or caching to `IndexedDB` (using a wrapper like `idb-keyval`) to keep the main thread free, or drastically reduce the synchronous write frequency using a debounce (see the sketch below).
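
To illustrate the debounce option, here is a minimal sketch. It assumes a simple `LogEntry` shape and a `client_logs` storage key, both hypothetical: entries are buffered and flushed in a single write at most once per second, so the full array is no longer stringified on every call.

```ts
interface LogEntry {
  level: string;
  message: string;
  timestamp: number;
}

const LOGS_KEY = 'client_logs'; // illustrative key, not the real one
const FLUSH_DELAY_MS = 1000;

let pending: LogEntry[] = [];
let flushTimer: ReturnType<typeof setTimeout> | null = null;

export function addLogDebounced(entry: LogEntry): void {
  pending.push(entry);
  if (flushTimer !== null) return;
  flushTimer = setTimeout(() => {
    flushTimer = null;
    const existing: LogEntry[] = JSON.parse(localStorage.getItem(LOGS_KEY) ?? '[]');
    const merged = [...existing, ...pending].slice(-500); // keep the 500-entry cap
    pending = [];
    // One synchronous write per second at most, instead of one per log call.
    localStorage.setItem(LOGS_KEY, JSON.stringify(merged));
  }, FLUSH_DELAY_MS);
}
```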

### 2. Crypto Key Persistence (`crypto.ts`)

The encryption key is derived from a `DEVICE_ID_KEY` stored in `localStorage`.

* **Risk:** If the user clears their browser cache/local storage, the `DEVICE_ID_KEY` is lost. Consequently, the `SECURE_DATA_KEY` (containing API keys) becomes undecryptable garbage.
* **Recommendation:** This is an inherent trade-off of browser-based crypto. Ensure the UI handles "Decryption failed" gracefully by prompting the user to re-enter API keys rather than crashing.

### 3. Preferences Hydration Complexity (`preferences/tauri.ts`)

The `hydratePreferencesFromTauri` function attempts to merge server state with local state while protecting "local-only" fields (like `audio_devices`).

* **Observation:** The logic relies on `Object.assign` and explicit exclusions.
* **Risk:** As the preferences object grows, the list of "local-only" keys must be maintained in multiple places (`preferences-sync.ts` and `tauri.ts`).
* **Recommendation:** Define the `UserPreferences` schema such that local-only keys are strictly typed or separated into a distinct sub-object to automate this merging logic (see the sketch after this list).
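
One way to make the split explicit, as a sketch only: the `SyncedPreferences`/`LocalOnlyPreferences` names and fields below are assumptions, not the project's actual schema.

```ts
// Hypothetical shape: synced fields and local-only fields live in separate
// sub-objects, so the merge never needs a hand-maintained key list.
interface SyncedPreferences {
  theme: 'light' | 'dark';
  language: string;
}

interface LocalOnlyPreferences {
  audio_devices: { inputId?: string; outputId?: string };
}

interface UserPreferences {
  synced: SyncedPreferences;
  local: LocalOnlyPreferences;
}

// Hydration becomes a single, mechanical merge.
function hydrate(server: SyncedPreferences, current: UserPreferences): UserPreferences {
  return {
    synced: { ...current.synced, ...server }, // server wins for synced keys
    local: current.local,                     // local-only keys are never touched
  };
}
```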

### 4. Meeting Cache Writes (`meeting-cache.ts`)

* **Issue:** The cache writes the *entire* meeting map to storage on every update.
* **Risk:** For users with many long meetings, this JSON blob could exceed the 5MB localStorage limit or cause performance issues.
* **Recommendation:** Consider storing only metadata in the monolithic cache key and storing individual meeting segments in separate keys (e.g., `meeting_segments_${id}`), or switching to IndexedDB.

## Code-Specific Feedback

### `client/src/lib/async-utils.ts`

The `fireAndForget` function is good, but `options?.metadata` should ideally merge with context from the error if the error is typed.

```typescript
// Current
const message = err instanceof Error ? err.message : String(err);

// Suggestion: Extract stack trace for debug level
const details = err instanceof Error ? err.stack : String(err);
```

#### **B. Client-Side Filtering on Potentially Large Datasets**

**File:** `client/src/components/analytics/logs-tab.tsx` (Line 113)

The filtering of logs happens entirely on the client side:

```typescript
const filteredLogs = useMemo(() => { ... }, [mergedLogs, ...]);
```

If the application runs for a long time, `mergedLogs` could grow to thousands of entries. Filtering this array on every keystroke in the search box will cause UI jank.

* **Recommendation:**
  1. Implement **pagination** or **virtualization** (e.g., `react-virtuoso`) for the `ScrollArea`.
  2. Debounce the search input state update (see the sketch below).
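
A minimal sketch of the debounce, assuming a `searchTerm` input state and a `mergedLogs` array; the hook and field names are illustrative.

```tsx
import { useEffect, useState } from 'react';

// Debounce the raw input so filtering runs at most once per 250 ms,
// not on every keystroke.
function useDebouncedValue<T>(value: T, delayMs = 250): T {
  const [debounced, setDebounced] = useState(value);
  useEffect(() => {
    const timer = setTimeout(() => setDebounced(value), delayMs);
    return () => clearTimeout(timer);
  }, [value, delayMs]);
  return debounced;
}

// Inside LogsTab (sketch):
// const [searchTerm, setSearchTerm] = useState('');
// const debouncedTerm = useDebouncedValue(searchTerm);
// const filteredLogs = useMemo(
//   () => mergedLogs.filter((log) => log.message.includes(debouncedTerm)),
//   [mergedLogs, debouncedTerm]
// );
```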

### `client/src/lib/ai-providers/fetch-models.ts`

The `fetchModels` function uses a large switch statement.

* **Refactor:** Consider a strategy pattern or a configuration object mapping providers to their fetch implementations. This would make `ai-providers/index.ts` cleaner and easier to extend (a sketch of the registry approach follows).
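
For illustration, a registry-style sketch. The provider entries and the `ModelInfo` type are assumptions, not the module's real exports.

```ts
interface ModelInfo {
  id: string;
  name: string;
}

type ModelFetcher = (apiKey: string) => Promise<ModelInfo[]>;

// Each provider registers its own fetcher; adding a provider no longer
// means growing a switch statement.
const fetchers: Record<string, ModelFetcher> = {
  openai: async (apiKey) => {
    const res = await fetch('https://api.openai.com/v1/models', {
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    const body = await res.json();
    return body.data.map((m: { id: string }) => ({ id: m.id, name: m.id }));
  },
  // ollama, anthropic, ... register here
};

export async function fetchModels(provider: string, apiKey: string): Promise<ModelInfo[]> {
  const fetcher = fetchers[provider];
  if (!fetcher) {
    throw new Error(`Unknown provider: ${provider}`);
  }
  return fetcher(apiKey);
}
```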

#### **C. Memory Leak Risk in `AudioLevelMeter`**

**File:** `client/src/components/recording/audio-level-meter.tsx` (Lines 45-49)

The component sets an interval to generate random levels when no specific level is provided but `isActive` is true. While `useEffect` cleans it up, if `isActive` toggles rapidly or the component re-renders frequently due to parent state changes, this could cause jitter.

* **Refinement:** Ensure `isActive` is stable. The current implementation is safe regarding memory leaks due to the cleanup function, but the logic `if (typeof level === 'number')` inside the effect might cause the interval to be set/cleared rapidly if `level` fluctuates between number and undefined (unlikely but possible).

### `client/src/lib/preferences-sync.ts`

In `pushToServer`:

```typescript
const response = await invoke<SetPreferencesResult>('set_preferences_sync', {
  preferences: encoded,
  if_match: options?.force ? null : meta.etag,
```

* **Note:** The optimistic locking via `etag` is a great implementation detail for preventing overwrite conflicts.

---

### **2. Code Quality & Refactoring**

#### **A. Complex Component Decomposition**

**File:** `client/src/components/settings/ai-config-section.tsx`

This component is becoming a "God Component." It manages state for Transcription, Summary, and Embedding configurations simultaneously, including fetching states, testing states, and secure storage loading.

* **Refactor:** Create a reusable custom hook `useAIProviderConfig(configType)` that handles the state, loading, and API calls for a single provider type. This would reduce the main component code by ~50% (a sketch of such a hook follows).
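
A rough sketch of what such a hook could look like. The `ProviderConfig` shape and the injected `fetchModels` callback are placeholders, not the existing API.

```ts
import { useCallback, useState } from 'react';

// Hypothetical shape; the real config type lives in the settings module.
interface ProviderConfig {
  provider: string;
  model: string;
  apiKey: string;
}

type ConfigType = 'transcription' | 'summary' | 'embedding';

export function useAIProviderConfig(configType: ConfigType) {
  const [config, setConfig] = useState<ProviderConfig | null>(null);
  const [models, setModels] = useState<string[]>([]);
  const [isLoadingModels, setIsLoadingModels] = useState(false);
  const [isTesting, setIsTesting] = useState(false);

  const loadModels = useCallback(
    async (fetchModels: (c: ProviderConfig) => Promise<string[]>) => {
      if (!config) return;
      setIsLoadingModels(true);
      try {
        setModels(await fetchModels(config));
      } finally {
        setIsLoadingModels(false);
      }
    },
    [config]
  );

  // testConnection, secure-storage persistence, etc. would follow the same
  // pattern, keyed by configType, so three hook instances replace the shared state.
  return { configType, config, setConfig, models, isLoadingModels, isTesting, setIsTesting, loadModels };
}
```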

#### **B. Prop Drilling in Settings**

**File:** `client/src/components/settings/integrations-section/index.tsx` -> `IntegrationItem.tsx` -> `IntegrationConfigPanel`

Configuration handlers and state are passed down through several layers.

* **Recommendation:** Since this is a settings page, utilizing a context (`IntegrationSettingsContext`) would clean up the component signatures significantly and make adding new actions (like "Test Connection") easier to implement without touching intermediate components.

#### **C. Magic Numbers**

**File:** `client/src/components/sync-history-log.tsx`

```typescript
// Lines 246-251
const interval = setInterval(loadHistory, 30000); // 30 seconds
```

While `Timing.THIRTY_SECONDS_MS` is imported, there are still raw numbers used in some places (e.g., `client/src/components/recording/audio-level-meter.tsx` uses `100` for its interval).

* **Recommendation:** Move all timing constants to `@/api/constants` or a local `consts.ts` file to ensure consistency across the app.

---

### **3. UI/UX & Accessibility**

#### **A. Accessible Icon Buttons**

**File:** `client/src/components/recording/audio-device-selector.tsx` (and others)

```typescript
<Button variant="outline" size="sm" ...>
  <Mic className={iconSize.sm} />
</Button>
```

When in `compact` mode, this button has no text.

* **Fix:** Ensure all icon-only buttons have `aria-label` props or `<span className="sr-only">Description</span>` for screen readers. The `SidebarMenuButton` handles tooltips well, but ad-hoc buttons in headers often miss this.

#### **B. Feedback during "Save" Operations**

**File:** `client/src/components/settings/advanced-local-ai-settings/model-auth-section.tsx`

When saving the token, the button shows a spinner. However, there is no success toast or visual feedback *after* the save completes, only if it fails (via the `error` prop) or implicitly by the input clearing.

* **Recommendation:** Explicitly trigger a `toast({ title: "Saved" })` upon success to give the user closure.

---

### **4. Security Considerations**

#### **A. Sensitive Data in Logs**

**File:** `client/src/components/analytics/log-entry.tsx`

```typescript
<pre className="text-xs bg-muted/50 p-2 rounded overflow-x-auto">
  {JSON.stringify(log.metadata, null, 2)}
</pre>
```

* **Risk:** If `metadata` contains PII or tokens (even accidentally), it is rendered in plain text in the UI (and potentially exported).
* **Mitigation:** Implement a `sanitizeMetadata` utility that runs before rendering or exporting logs, redacting keys like `api_key`, `token`, `secret`, and `authorization` (a sketch follows).
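
A minimal sketch of such a utility; the key list and recursion strategy are illustrative choices.

```ts
const SENSITIVE_KEYS = ['api_key', 'apikey', 'token', 'secret', 'authorization', 'password'];

// Recursively replace values of sensitive-looking keys before the metadata
// is rendered or included in an exported log bundle.
export function sanitizeMetadata(value: unknown): unknown {
  if (Array.isArray(value)) {
    return value.map(sanitizeMetadata);
  }
  if (value && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([key, v]) => {
        const isSensitive = SENSITIVE_KEYS.some((k) => key.toLowerCase().includes(k));
        return [key, isSensitive ? '[REDACTED]' : sanitizeMetadata(v)];
      })
    );
  }
  return value;
}

// Usage in log-entry.tsx (sketch):
// {JSON.stringify(sanitizeMetadata(log.metadata), null, 2)}
```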

#### **B. Secure Storage Fallback**

**File:** `client/src/components/settings/ai-config-section.tsx`

The component checks `isSecureStorageAvailable()`. If false, it simply returns.

* **UX Issue:** If secure storage fails (e.g., on a specific browser or OS restriction), the user can't save API keys at all.
* **Recommendation:** Fall back to `localStorage` (with a warning/consent dialog) or in-memory storage if secure storage is unavailable, rather than failing silently.

---

### **5. Test Coverage**

The codebase has excellent test coverage patterns.

* **Good:** `client/src/components/analytics/logs-tab.test.tsx` mocks `ResizeObserver` and Time/Date functions effectively.
* **Good:** `client/src/components/recording/audio-device-selector.test.tsx` tests permission states.
* **Missing:** There are no tests for `client/src/components/settings/integrations-section/use-integration-handlers.ts`. This hook contains complex logic regarding OAuth flows and state updates. It is a critical path and should be unit tested.

---

### **6. Specific Code Fixes**

**Fix for `client/src/components/timestamped-notes-editor.tsx` (UUID issue):**

```typescript
// Replace crypto.randomUUID() usage
function generateId() {
  if (typeof crypto !== 'undefined' && crypto.randomUUID) {
    return crypto.randomUUID();
  }
  // Simple fallback for non-secure contexts
  return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
    const r = (Math.random() * 16) | 0,
      v = c == 'x' ? r : (r & 0x3) | 0x8;
    return v.toString(16);
  });
}

// Usage in saveNote:
const newNote: NoteEdit = {
  id: generateId(),
  // ...
};
```

### `client/src/lib/event-emitter.ts`

The `createMultiEventEmitter` is quite useful.

* **Nitpick:** The error handling inside `emit` catches errors from listeners. This is good for stability, but ensure `addClientLog` doesn't create a recursive loop if the event being emitted is a log event.

**Optimization for `client/src/components/recording/speaker-distribution.tsx`:**

The `SpeakerDistribution` re-calculates `speakerCounts` and sorts `speakers` on every render. While `useMemo` is used, ensuring the `segments` array reference is stable in the parent component is crucial to prevent unnecessary recalculations during active recording.

```typescript
// Ensure the parent passes a stable reference, or memoize efficiently
const speakerCounts = useMemo(() => {
  // Logic remains the same, but ensure this isn't running on every audio
  // frame update passed via props.
}, [segments]);
```

## Conclusion

This is a **high-quality codebase**. The developers have successfully abstracted away the complexity of the Tauri bridge, hardware interactions, and state synchronization. The primary concern is scalability regarding `localStorage` usage as the dataset (logs/meetings) grows. Transitioning specific heavy stores to IndexedDB would be the next logical architectural step.

### **Conclusion**

The code is production-ready with minor adjustments. It adheres to modern React standards. The primary focus for the next sprint should be:

1. **Hardening:** UUID fallback and metadata sanitization.
2. **Performance:** Virtualization for the Logs tab.
3. **Refactoring:** Breaking down the AI Config settings into smaller hooks/components.
.claudectx/pages.md (new file, 136 lines)
@@ -0,0 +1,136 @@
Here’s a *real* code review based on the Repomix snapshot you uploaded (frontend pages + a bunch of client/src + tauri + backend index listing).

## 🚨 Must-fix bugs / correctness issues

### 1) React key bug in Analytics “Speaker Participation” pie

You render pie cells with `key={`cell-${stat.speaker}`}`, but `SpeakerStats` uses `speakerId`, not `speaker`. This will produce duplicate keys (`cell-undefined`) and can break reconciliation.

**Fix**

```tsx
{analytics.speakerStats.map((stat, idx) => (
  <Cell
    key={`cell-${stat.speakerId}`}
    fill={SPEAKER_COLORS[idx % SPEAKER_COLORS.length]}
  />
))}
```

(Also removes the `indexOf(stat)` O(n²) pattern.)

---

### 2) “RefreshKey” hack in PeoplePage is a smell (and currently pointless)

Inside `useMemo`, you have `void refreshKey;`, which does nothing except silence “unused” linting.

If you want rerenders after `preferences.setGlobalSpeakerName`, you should subscribe to the preferences store (or use a store hook) instead of manually bumping a counter.

**Better**

* Add `preferences.subscribe(...)` and store speaker names in state, or
* Expose a proper `usePreferences()` hook that triggers updates, or
* Use `useSyncExternalStore` for preferences (see the sketch below).

At minimum: **delete** the `void refreshKey;` line and keep the dependency array.
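
For the `useSyncExternalStore` option, here is a sketch. It assumes the preferences module exposes (or can expose) `subscribe` and `getGlobalSpeakerNames`; the names and import path are illustrative.

```tsx
import { useSyncExternalStore } from 'react';
import { preferences } from '@/lib/preferences'; // assumed module path

// Components re-render whenever the store notifies subscribers; no manual
// refreshKey counter needed.
function useGlobalSpeakerNames(): Record<string, string> {
  return useSyncExternalStore(
    (onStoreChange) => preferences.subscribe(onStoreChange), // must return an unsubscribe fn
    () => preferences.getGlobalSpeakerNames()                // must return a stable snapshot
  );
}

// PeoplePage (sketch):
// const speakerNames = useGlobalSpeakerNames();
// const people = useMemo(() => buildPeople(meetings, speakerNames), [meetings, speakerNames]);
```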

---

## 🧠 API & state architecture review (Tauri / cached / mock)

### 3) Fire-and-forget is everywhere — keep it, but stop swallowing errors silently

Patterns like:

* `.catch(() => {})` in event bridges / invoke calls (fire-and-forget)
* `void startProcessing(meetingId)` (post-processing)

…are okay **only** for high-frequency, non-critical operations, *but you need an error sink*.

Right now, a failure to `SEND_AUDIO_CHUNK` (or similar) becomes invisible. That’s how you end up with “backend never logs anything” and no idea why.

**Recommendation (small, high impact):**
Create a single helper used everywhere:

```ts
export function fireAndForget(p: Promise<unknown>, meta: { op: string }) {
  p.catch((err) => addClientLog({
    level: 'warning',
    source: 'api',
    message: `Fire-and-forget failed: ${meta.op}`,
    details: err instanceof Error ? err.message : String(err),
  }));
}
```

Then replace `promise.catch(()=>{})` / `void promise` with `fireAndForget(...)`.

This preserves your non-blocking UX **without** silent failures.

---

### 4) Connection/reconnection logic is solid conceptually, but watch retry backoff math

Your reconnection code increments attempts on failure and then computes the delay from attempts. Ensure you’re not “double counting” attempts when scheduling the next delay (a common bug when you do `attempts + 1` in more than one place). I’m flagging this because the pattern in the snapshot looks prone to it; a sketch of the single-increment shape follows.
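
To make the concern concrete, a minimal sketch of a backoff loop where the attempt counter changes in exactly one place; the names are illustrative, not your reconnect module's API.

```ts
const BASE_DELAY_MS = 1000;
const MAX_DELAY_MS = 30_000;

let attempts = 0;

function nextDelayMs(): number {
  // Derive the delay from the current count; do NOT add +1 here as well,
  // or the first retry already waits as if two attempts had failed.
  return Math.min(BASE_DELAY_MS * 2 ** attempts, MAX_DELAY_MS);
}

async function reconnectLoop(connect: () => Promise<void>): Promise<void> {
  for (;;) {
    try {
      await connect();
      attempts = 0; // reset on success
      return;
    } catch {
      const delay = nextDelayMs();
      attempts += 1; // the single place the counter changes
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```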

---

## ♻️ Duplication & consolidation opportunities (high ROI)

### 5) Export logic is duplicated (and error-prone)

You implement base64→bytes→Blob in multiple places (Meeting Detail export, Diagnostics export). Consolidate to one utility:

* `buildExportBlob(format, content)`
* `downloadBlob(filename, blob)`

Also: revoking `URL.createObjectURL(blob)` immediately after `a.click()` can intermittently break downloads in some browsers. Prefer:

```ts
a.click();
setTimeout(() => URL.revokeObjectURL(url), 1000);
```
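
Putting both points together, a sketch of the consolidated helpers. The `buildExportBlob`/`downloadBlob` names come from the bullets above; the format-to-MIME mapping is an assumption.

```ts
// Single place for base64 → bytes → Blob and for the anchor-click download,
// including the deferred revokeObjectURL.
export function buildExportBlob(format: 'markdown' | 'json' | 'pdf', base64Content: string): Blob {
  const bytes = Uint8Array.from(atob(base64Content), (c) => c.charCodeAt(0));
  const mime =
    format === 'json' ? 'application/json' : format === 'pdf' ? 'application/pdf' : 'text/markdown';
  return new Blob([bytes], { type: mime });
}

export function downloadBlob(filename: string, blob: Blob): void {
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  document.body.appendChild(a);
  a.click();
  a.remove();
  setTimeout(() => URL.revokeObjectURL(url), 1000); // revoke after the download has started
}
```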

---

### 6) Multiple enum-mapping “sources of truth”

You have multiple mapping systems (grpc ↔ domain values) across helpers. This is exactly where drift happens (and it’s hard to detect unless you have contract tests).

**Recommendation**

* Make a single `enums.ts` mapping module per boundary (gRPC boundary vs UI boundary)
* Export **typed** conversion fns (don’t accept `string`, accept `MeetingState`, `AnnotationType`, etc.)
* Add a test that asserts mappings cover *every* union member (exhaustive); see the sketch below.
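
A sketch of the exhaustive-mapping pattern; `MeetingState` matches the union used elsewhere in the snapshot, but the gRPC numeric values here are placeholders.

```ts
type MeetingState = 'created' | 'recording' | 'stopped' | 'completed' | 'error';

// Record<MeetingState, ...> forces a compile error the moment a new union
// member is added without a mapping.
const MEETING_STATE_TO_GRPC: Record<MeetingState, number> = {
  created: 0,
  recording: 1,
  stopped: 2,
  completed: 3,
  error: 4,
};

export function meetingStateToGrpc(state: MeetingState): number {
  return MEETING_STATE_TO_GRPC[state];
}

// Vitest sketch: assert coverage at runtime too.
// it('maps every MeetingState', () => {
//   const states: MeetingState[] = ['created', 'recording', 'stopped', 'completed', 'error'];
//   for (const s of states) {
//     expect(MEETING_STATE_TO_GRPC[s]).toBeDefined();
//   }
// });
```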

---

## 🧹 Dead/unused code and small cleanups

### 7) Analytics performance & clarity

* Avoid `indexOf(stat)` inside `.map()` (O(n²)); use the `map` index. (See fix above.)
* Use consistent IDs: your chart gradients use static IDs like `"durationGradient"`, which can collide if multiple charts render on the same page. Consider prefixing with a stable instance ID (as sketched below).
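
One way to do the prefixing is React's `useId`; the chart structure below is an illustrative stand-in, not your exact component.

```tsx
import { useId } from 'react';

function DurationChart() {
  // useId is stable per component instance, so two charts on one page get
  // distinct gradient ids instead of both referencing "durationGradient".
  // Colons are stripped so the id is safe inside url(#...).
  const gradientId = `${useId().replace(/:/g, '')}-durationGradient`;

  return (
    <svg width={320} height={120}>
      <defs>
        <linearGradient id={gradientId} x1="0" y1="0" x2="0" y2="1">
          <stop offset="5%" stopColor="#8884d8" stopOpacity={0.8} />
          <stop offset="95%" stopColor="#8884d8" stopOpacity={0.1} />
        </linearGradient>
      </defs>
      {/* the fill references the per-instance id */}
      <rect x={0} y={0} width={320} height={120} fill={`url(#${gradientId})`} />
    </svg>
  );
}
```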

### 8) Meeting detail + processing hooks look good, but verify repomix artifacts

Some snippets show clearly broken tokens (e.g., `.gridProps`, `.speakers`, `setMeeting({ .meeting, summary })`). Those *might* be Repomix formatting artifacts, but if they exist in your real source they’re compile breakers. Spot-check those exact files locally.

---

## ✅ What you’re doing well

* **Separation of adapters** (mock/cached/tauri) is clean and test-covered.
* **Guarded mutations** for offline-mode UX are a strong pattern.
* The Meeting Detail page is structured sensibly (Header, Transcript, Summary/Entities tabs).

---

## If you want the highest-impact next PR

1. Fix the Analytics pie key bug.
2. Add a `fireAndForget()` helper + replace silent catches.
3. Centralize export blob + download helper and reuse it.
4. Remove the `void refreshKey` hack and move to a real preferences subscription/store.

If you tell me which area you care about most (UI pages vs API adapters vs Tauri bridge vs backend python services), I’ll go deeper on that slice and produce a punch-list that’s basically “open PRs” ready.
@@ -7,7 +7,14 @@
|
||||
},
|
||||
"files": {
|
||||
"ignoreUnknown": false,
|
||||
"includes": ["**", "!**/dist", "!**/node_modules", "!**/src-tauri/target", "!**/*.gen.ts", "!**/src-tauri/src/*.html"]
|
||||
"includes": [
|
||||
"**",
|
||||
"!**/dist",
|
||||
"!**/node_modules",
|
||||
"!**/src-tauri/target",
|
||||
"!**/*.gen.ts",
|
||||
"!**/src-tauri/src/*.html"
|
||||
]
|
||||
},
|
||||
"overrides": [
|
||||
{
|
||||
|
||||
@@ -885,13 +885,13 @@ describe('audio: recording flow with hardware', () => {
|
||||
interval: IntegrationTimeouts.POLLING_INTERVAL_MS,
|
||||
}
|
||||
);
|
||||
} catch {
|
||||
// Recording failed - this is OK without audio hardware
|
||||
process.stdout.write('Recording did not start - likely no audio permission or device\n');
|
||||
const hasStartButton = await isLabelDisplayed('Start Recording');
|
||||
expect(hasStartButton).toBe(true);
|
||||
return;
|
||||
}
|
||||
} catch {
|
||||
// Recording failed - this is OK without audio hardware
|
||||
process.stdout.write('Recording did not start - likely no audio permission or device\n');
|
||||
const hasStartButton = await isLabelDisplayed('Start Recording');
|
||||
expect(hasStartButton).toBe(true);
|
||||
return;
|
||||
}
|
||||
|
||||
if (!recordingActive) {
|
||||
return;
|
||||
@@ -901,11 +901,11 @@ describe('audio: recording flow with hardware', () => {
|
||||
await browser.pause(AudioTestTimeouts.AUDIO_RECORDING_MS);
|
||||
|
||||
// Check for audio level visualization during recording
|
||||
const hasAudioLevelIndicator =
|
||||
(await isLabelDisplayed('Audio Level')) ||
|
||||
(await isLabelDisplayed('VU')) ||
|
||||
(await isLabelDisplayed('Input Level'));
|
||||
expect(typeof hasAudioLevelIndicator).toBe('boolean');
|
||||
const hasAudioLevelIndicator =
|
||||
(await isLabelDisplayed('Audio Level')) ||
|
||||
(await isLabelDisplayed('VU')) ||
|
||||
(await isLabelDisplayed('Input Level'));
|
||||
expect(typeof hasAudioLevelIndicator).toBe('boolean');
|
||||
|
||||
// Stop recording
|
||||
await clickByLabel('Stop Recording');
|
||||
|
||||
@@ -128,7 +128,11 @@ describe('Preferences', () => {
|
||||
});
|
||||
|
||||
it('should load user preferences', async () => {
|
||||
const result = await executeInApp<{ success?: boolean; prefs?: Record<string, unknown>; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
success?: boolean;
|
||||
prefs?: Record<string, unknown>;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'getPreferences',
|
||||
});
|
||||
|
||||
|
||||
@@ -15,9 +15,11 @@ describe('Server Connection', () => {
|
||||
|
||||
describe('isConnected', () => {
|
||||
it('should return connection status', async () => {
|
||||
const result = await executeInApp<{ success?: boolean; connected?: boolean; error?: string }>({
|
||||
type: 'isConnected',
|
||||
});
|
||||
const result = await executeInApp<{ success?: boolean; connected?: boolean; error?: string }>(
|
||||
{
|
||||
type: 'isConnected',
|
||||
}
|
||||
);
|
||||
|
||||
expect(result.success).toBe(true);
|
||||
expect(typeof result.connected).toBe('boolean');
|
||||
@@ -26,7 +28,11 @@ describe('Server Connection', () => {
|
||||
|
||||
describe('getServerInfo', () => {
|
||||
it('should return server information when connected', async () => {
|
||||
const result = await executeInApp<{ success?: boolean; info?: Record<string, unknown>; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
success?: boolean;
|
||||
info?: Record<string, unknown>;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'getServerInfo',
|
||||
});
|
||||
|
||||
@@ -39,7 +45,11 @@ describe('Server Connection', () => {
|
||||
|
||||
describe('connect', () => {
|
||||
it('should connect to server with default URL', async () => {
|
||||
const result = await executeInApp<{ success?: boolean; info?: Record<string, unknown>; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
success?: boolean;
|
||||
info?: Record<string, unknown>;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'connectDefault',
|
||||
});
|
||||
|
||||
@@ -66,7 +76,11 @@ describe('Identity', () => {
|
||||
|
||||
describe('getCurrentUser', () => {
|
||||
it('should return current user info', async () => {
|
||||
const result = await executeInApp<{ success?: boolean; user?: Record<string, unknown>; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
success?: boolean;
|
||||
user?: Record<string, unknown>;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'getCurrentUser',
|
||||
});
|
||||
|
||||
@@ -105,9 +119,11 @@ describe('Projects', () => {
|
||||
|
||||
describe('listProjects', () => {
|
||||
it('should list projects', async () => {
|
||||
const workspaces = await executeInApp<{ workspaces?: Array<{ id: string }>; error?: string }>({
|
||||
type: 'listWorkspaces',
|
||||
});
|
||||
const workspaces = await executeInApp<{ workspaces?: Array<{ id: string }>; error?: string }>(
|
||||
{
|
||||
type: 'listWorkspaces',
|
||||
}
|
||||
);
|
||||
if (!workspaces?.workspaces?.length) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -152,7 +152,6 @@ describe('Speaker Diarization', () => {
|
||||
await api?.deleteMeeting(id);
|
||||
}, meeting.id);
|
||||
});
|
||||
|
||||
});
|
||||
|
||||
describe('cancelDiarization', () => {
|
||||
|
||||
@@ -104,6 +104,5 @@ describe('Export Operations', () => {
|
||||
// Cleanup
|
||||
await executeInApp({ type: 'deleteMeeting', meetingId: meeting.id });
|
||||
});
|
||||
|
||||
});
|
||||
});
|
||||
|
||||
@@ -76,7 +76,11 @@ export type AppAction =
|
||||
| { type: 'connect'; serverUrl: string }
|
||||
| { type: 'resetRecordingState' }
|
||||
| { type: 'updatePreferences'; updates: Record<string, unknown> }
|
||||
| { type: 'forceConnectionState'; mode: 'connected' | 'disconnected' | 'cached' | 'mock'; serverUrl?: string | null }
|
||||
| {
|
||||
type: 'forceConnectionState';
|
||||
mode: 'connected' | 'disconnected' | 'cached' | 'mock';
|
||||
serverUrl?: string | null;
|
||||
}
|
||||
| {
|
||||
type: 'listMeetings';
|
||||
states?: Array<'created' | 'recording' | 'stopped' | 'completed' | 'error'>;
|
||||
@@ -233,7 +237,11 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return testInvoke;
|
||||
}
|
||||
const tauri = (window as { __TAURI__?: unknown }).__TAURI__ as
|
||||
| { core?: { invoke?: (cmd: string, args?: Record<string, unknown>) => Promise<unknown> } }
|
||||
| {
|
||||
core?: {
|
||||
invoke?: (cmd: string, args?: Record<string, unknown>) => Promise<unknown>;
|
||||
};
|
||||
}
|
||||
| { invoke?: (cmd: string, args?: Record<string, unknown>) => Promise<unknown> }
|
||||
| undefined;
|
||||
if (!tauri) {
|
||||
@@ -248,7 +256,9 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return null;
|
||||
};
|
||||
|
||||
const normalizeInjectResult = (result: unknown): { chunksSent: number; durationSeconds: number } | null => {
|
||||
const normalizeInjectResult = (
|
||||
result: unknown
|
||||
): { chunksSent: number; durationSeconds: number } | null => {
|
||||
if (!result || typeof result !== 'object') {
|
||||
return null;
|
||||
}
|
||||
@@ -270,8 +280,9 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return;
|
||||
}
|
||||
case 'resetRecordingState': {
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown })
|
||||
.__NOTEFLOW_TEST_API__ as { resetRecordingState?: () => Promise<unknown> } | undefined;
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown }).__NOTEFLOW_TEST_API__ as
|
||||
| { resetRecordingState?: () => Promise<unknown> }
|
||||
| undefined;
|
||||
if (typeof testApi?.resetRecordingState === 'function') {
|
||||
await testApi.resetRecordingState();
|
||||
finish({ success: true });
|
||||
@@ -287,8 +298,9 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return;
|
||||
}
|
||||
case 'updatePreferences': {
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown })
|
||||
.__NOTEFLOW_TEST_API__ as { updatePreferences?: (updates: Record<string, unknown>) => void } | undefined;
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown }).__NOTEFLOW_TEST_API__ as
|
||||
| { updatePreferences?: (updates: Record<string, unknown>) => void }
|
||||
| undefined;
|
||||
if (typeof testApi?.updatePreferences !== 'function') {
|
||||
try {
|
||||
const raw = localStorage.getItem('noteflow_preferences');
|
||||
@@ -306,8 +318,7 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return;
|
||||
}
|
||||
case 'forceConnectionState': {
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown })
|
||||
.__NOTEFLOW_TEST_API__ as
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown }).__NOTEFLOW_TEST_API__ as
|
||||
| {
|
||||
forceConnectionState?: (
|
||||
mode: 'connected' | 'disconnected' | 'cached' | 'mock',
|
||||
@@ -351,7 +362,10 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return;
|
||||
}
|
||||
case 'createMeeting': {
|
||||
const meeting = await api.createMeeting({ title: payload.title, metadata: payload.metadata });
|
||||
const meeting = await api.createMeeting({
|
||||
title: payload.title,
|
||||
metadata: payload.metadata,
|
||||
});
|
||||
finish(meeting);
|
||||
return;
|
||||
}
|
||||
@@ -522,7 +536,10 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
} catch (error) {
|
||||
const message = extractErrorMessage(error);
|
||||
const normalized = message.toLowerCase();
|
||||
if (normalized.includes('already streaming') || normalized.includes('already recording')) {
|
||||
if (
|
||||
normalized.includes('already streaming') ||
|
||||
normalized.includes('already recording')
|
||||
) {
|
||||
alreadyRecording = true;
|
||||
} else {
|
||||
finish({ success: false, error: message });
|
||||
@@ -530,8 +547,7 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
}
|
||||
}
|
||||
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown })
|
||||
.__NOTEFLOW_TEST_API__ as
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown }).__NOTEFLOW_TEST_API__ as
|
||||
| {
|
||||
injectTestTone?: (
|
||||
meetingId: string,
|
||||
@@ -586,27 +602,32 @@ export async function executeInApp<TResult>(action: AppAction): Promise<TResult>
|
||||
return;
|
||||
}
|
||||
case 'startTranscriptionWithInjection': {
|
||||
let alreadyRecording = false;
|
||||
try {
|
||||
const stream = await api.startTranscription(payload.meetingId);
|
||||
const streamStore = getStreamStore();
|
||||
streamStore[payload.meetingId] = stream;
|
||||
} catch (error) {
|
||||
const message = extractErrorMessage(error);
|
||||
const normalized = message.toLowerCase();
|
||||
if (normalized.includes('already streaming') || normalized.includes('already recording')) {
|
||||
// Treat as already-active stream and continue with injection.
|
||||
alreadyRecording = true;
|
||||
} else {
|
||||
finish({ success: false, error: message });
|
||||
return;
|
||||
let alreadyRecording = false;
|
||||
try {
|
||||
const stream = await api.startTranscription(payload.meetingId);
|
||||
const streamStore = getStreamStore();
|
||||
streamStore[payload.meetingId] = stream;
|
||||
} catch (error) {
|
||||
const message = extractErrorMessage(error);
|
||||
const normalized = message.toLowerCase();
|
||||
if (
|
||||
normalized.includes('already streaming') ||
|
||||
normalized.includes('already recording')
|
||||
) {
|
||||
// Treat as already-active stream and continue with injection.
|
||||
alreadyRecording = true;
|
||||
} else {
|
||||
finish({ success: false, error: message });
|
||||
return;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown })
|
||||
.__NOTEFLOW_TEST_API__ as
|
||||
const testApi = (window as { __NOTEFLOW_TEST_API__?: unknown }).__NOTEFLOW_TEST_API__ as
|
||||
| {
|
||||
injectTestAudio?: (meetingId: string, config: { wavPath: string; speed: number; chunkMs: number }) => Promise<unknown>;
|
||||
injectTestAudio?: (
|
||||
meetingId: string,
|
||||
config: { wavPath: string; speed: number; chunkMs: number }
|
||||
) => Promise<unknown>;
|
||||
injectTestTone?: (
|
||||
meetingId: string,
|
||||
frequency: number,
|
||||
|
||||
5
client/e2e-native/globals.d.ts
vendored
5
client/e2e-native/globals.d.ts
vendored
@@ -38,7 +38,10 @@ declare global {
|
||||
) => Promise<unknown>;
|
||||
isE2EMode?: () => string | undefined;
|
||||
updatePreferences?: (updates: Record<string, unknown>) => void;
|
||||
forceConnectionState?: (mode: 'connected' | 'disconnected' | 'cached' | 'mock', serverUrl?: string | null) => void;
|
||||
forceConnectionState?: (
|
||||
mode: 'connected' | 'disconnected' | 'cached' | 'mock',
|
||||
serverUrl?: string | null
|
||||
) => void;
|
||||
resetRecordingState?: () => Promise<unknown>;
|
||||
};
|
||||
|
||||
|
||||
@@ -32,7 +32,12 @@ async function listMeetings(
|
||||
limit = MEETING_LIST_LIMIT,
|
||||
offset = 0
|
||||
) {
|
||||
const result = await executeInApp<ListMeetingsResult>({ type: 'listMeetings', states, limit, offset });
|
||||
const result = await executeInApp<ListMeetingsResult>({
|
||||
type: 'listMeetings',
|
||||
states,
|
||||
limit,
|
||||
offset,
|
||||
});
|
||||
if (isErrorResult(result)) {
|
||||
throw new Error(`listMeetings failed: ${result.error}`);
|
||||
}
|
||||
@@ -86,9 +91,7 @@ function assertRecentMeeting(
|
||||
): void {
|
||||
const createdAt = meeting.created_at ?? 0;
|
||||
if (minCreatedAt && createdAt < minCreatedAt) {
|
||||
throw new Error(
|
||||
`Latest meeting predates scenario start (created_at=${createdAt.toFixed(1)}s)`
|
||||
);
|
||||
throw new Error(`Latest meeting predates scenario start (created_at=${createdAt.toFixed(1)}s)`);
|
||||
}
|
||||
const ageSeconds = Date.now() / 1000 - createdAt;
|
||||
if (!createdAt || ageSeconds > maxAgeSeconds) {
|
||||
@@ -146,11 +149,7 @@ async function stopMeetingIfRecording(meetingId: string): Promise<void> {
|
||||
}
|
||||
}
|
||||
|
||||
async function startTone(
|
||||
meetingId: string,
|
||||
tone = TONE,
|
||||
options?: { waitForRecording?: boolean }
|
||||
) {
|
||||
async function startTone(meetingId: string, tone = TONE, options?: { waitForRecording?: boolean }) {
|
||||
const result = await executeInApp({ type: 'startTranscriptionWithTone', meetingId, tone });
|
||||
if (!result.success) {
|
||||
throw new Error(`Tone injection failed: ${result.error ?? 'unknown error'}`);
|
||||
@@ -198,7 +197,11 @@ describe('Lifecycle stress tests', () => {
|
||||
timeoutMsg: 'Test API not available within 15s',
|
||||
}
|
||||
);
|
||||
const prefsResult = await executeInApp<{ success?: boolean; error?: string; needsReload?: boolean }>({
|
||||
const prefsResult = await executeInApp<{
|
||||
success?: boolean;
|
||||
error?: string;
|
||||
needsReload?: boolean;
|
||||
}>({
|
||||
type: 'updatePreferences',
|
||||
updates: { simulate_transcription: false, skip_simulation_confirmation: true },
|
||||
});
|
||||
@@ -316,7 +319,9 @@ describe('Lifecycle stress tests', () => {
|
||||
name: 'Start then immediate stop before injection completes',
|
||||
async run() {
|
||||
await ensureNoActiveRecordings();
|
||||
const meeting = await createMeeting(`Lifecycle immediate stop ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle immediate stop ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
await startTone(meeting.id, { ...TONE, seconds: 2 });
|
||||
await executeInApp({ type: 'stopMeeting', meetingId: meeting.id });
|
||||
@@ -329,7 +334,9 @@ describe('Lifecycle stress tests', () => {
|
||||
name: 'Double start on same meeting should not crash',
|
||||
async run() {
|
||||
await ensureNoActiveRecordings();
|
||||
const meeting = await createMeeting(`Lifecycle double start ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle double start ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
await startTone(meeting.id);
|
||||
const secondStart = await executeInApp({
|
||||
@@ -385,7 +392,9 @@ describe('Lifecycle stress tests', () => {
|
||||
}
|
||||
await waitForMeetingState(meeting.id, ['stopped', 'completed']);
|
||||
// Evidence
|
||||
console.log(`[e2e-lifecycle] stop-active: stopped=${result.stopped ?? 0} meeting=${meeting.id}`);
|
||||
console.log(
|
||||
`[e2e-lifecycle] stop-active: stopped=${result.stopped ?? 0} meeting=${meeting.id}`
|
||||
);
|
||||
},
|
||||
},
|
||||
{
|
||||
@@ -438,7 +447,9 @@ describe('Lifecycle stress tests', () => {
|
||||
name: 'Delete meeting while recording does not leave an active recording behind',
|
||||
async run() {
|
||||
await ensureNoActiveRecordings();
|
||||
const meeting = await createMeeting(`Lifecycle delete-active ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle delete-active ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
await startTone(meeting.id);
|
||||
await deleteMeeting(meeting.id);
|
||||
@@ -540,7 +551,9 @@ describe('Lifecycle stress tests', () => {
|
||||
await executeInApp({ type: 'stopMeeting', meetingId: meeting.id });
|
||||
await waitForMeetingState(meeting.id, ['stopped', 'completed']);
|
||||
// Evidence
|
||||
console.log(`[e2e-lifecycle] annotation-live: meeting=${meeting.id} annotation=${annotationId}`);
|
||||
console.log(
|
||||
`[e2e-lifecycle] annotation-live: meeting=${meeting.id} annotation=${annotationId}`
|
||||
);
|
||||
},
|
||||
},
|
||||
{
|
||||
@@ -552,7 +565,11 @@ describe('Lifecycle stress tests', () => {
|
||||
await startTone(meeting.id, { ...TONE, seconds: 2 });
|
||||
await executeInApp({ type: 'stopMeeting', meetingId: meeting.id });
|
||||
await waitForMeetingState(meeting.id, ['stopped', 'completed']);
|
||||
const summary = await executeInApp({ type: 'generateSummary', meetingId: meeting.id, force: true });
|
||||
const summary = await executeInApp({
|
||||
type: 'generateSummary',
|
||||
meetingId: meeting.id,
|
||||
force: true,
|
||||
});
|
||||
if (isErrorResult(summary)) {
|
||||
throw new Error(`Summary generation failed: ${summary.error}`);
|
||||
}
|
||||
@@ -564,11 +581,17 @@ describe('Lifecycle stress tests', () => {
|
||||
name: 'Concurrent stop and summary requests do not crash',
|
||||
async run() {
|
||||
await ensureNoActiveRecordings();
|
||||
const meeting = await createMeeting(`Lifecycle stop-summary ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle stop-summary ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
await startTone(meeting.id, { ...TONE, seconds: 2 });
|
||||
const stopPromise = executeInApp({ type: 'stopMeeting', meetingId: meeting.id });
|
||||
const summaryPromise = executeInApp({ type: 'generateSummary', meetingId: meeting.id, force: true });
|
||||
const summaryPromise = executeInApp({
|
||||
type: 'generateSummary',
|
||||
meetingId: meeting.id,
|
||||
force: true,
|
||||
});
|
||||
await Promise.all([stopPromise, summaryPromise]);
|
||||
await waitForMeetingState(meeting.id, ['stopped', 'completed']);
|
||||
// Evidence
|
||||
@@ -599,14 +622,18 @@ describe('Lifecycle stress tests', () => {
|
||||
name: 'Start recording after delete does not reuse deleted meeting',
|
||||
async run() {
|
||||
await ensureNoActiveRecordings();
|
||||
const meeting = await createMeeting(`Lifecycle delete-restart ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle delete-restart ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
await deleteMeeting(meeting.id);
|
||||
const meetings = await listMeetings();
|
||||
if (meetings.some((item) => item.id === meeting.id)) {
|
||||
throw new Error('Deleted meeting still appears in list');
|
||||
}
|
||||
const replacement = await createMeeting(`Lifecycle delete-restart new ${TestData.generateTestId()}`);
|
||||
const replacement = await createMeeting(
|
||||
`Lifecycle delete-restart new ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(replacement.id);
|
||||
await startTone(replacement.id, TONE);
|
||||
await executeInApp({ type: 'stopMeeting', meetingId: replacement.id });
|
||||
@@ -621,7 +648,9 @@ describe('Lifecycle stress tests', () => {
|
||||
await ensureNoActiveRecordings();
|
||||
const meetingIds: string[] = [];
|
||||
for (let i = 0; i < 3; i += 1) {
|
||||
const meeting = await createMeeting(`Lifecycle rapid chain ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle rapid chain ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
meetingIds.push(meeting.id);
|
||||
await startTone(meeting.id, { ...TONE, seconds: 1 });
|
||||
@@ -636,7 +665,9 @@ describe('Lifecycle stress tests', () => {
|
||||
name: 'Stop recording while injecting tone continues gracefully',
|
||||
async run() {
|
||||
await ensureNoActiveRecordings();
|
||||
const meeting = await createMeeting(`Lifecycle stop-during-inject ${TestData.generateTestId()}`);
|
||||
const meeting = await createMeeting(
|
||||
`Lifecycle stop-during-inject ${TestData.generateTestId()}`
|
||||
);
|
||||
createdMeetingIds.add(meeting.id);
|
||||
void startTone(meeting.id, { ...TONE, seconds: 2 }, { waitForRecording: false });
|
||||
await waitForMeetingState(meeting.id, ['recording']);
|
||||
@@ -697,7 +728,9 @@ describe('Lifecycle stress tests', () => {
|
||||
throw new Error('Meeting state did not transition out of recording');
|
||||
}
|
||||
// Evidence
|
||||
console.log(`[e2e-lifecycle] badge-transition: meeting=${meeting.id} state=${stopped.state}`);
|
||||
console.log(
|
||||
`[e2e-lifecycle] badge-transition: meeting=${meeting.id} state=${stopped.state}`
|
||||
);
|
||||
},
|
||||
},
|
||||
];
|
||||
|
||||
@@ -28,7 +28,11 @@ describe('Meeting Operations', () => {
|
||||
|
||||
describe('listMeetings', () => {
|
||||
it('should list meetings with default parameters', async () => {
|
||||
const result = await executeInApp<{ meetings?: unknown[]; total_count?: number; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
meetings?: unknown[];
|
||||
total_count?: number;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'listMeetings',
|
||||
limit: 10,
|
||||
});
|
||||
@@ -80,12 +84,16 @@ describe('Meeting Operations', () => {
|
||||
it('should create a new meeting', async () => {
|
||||
const title = TestData.createMeetingTitle();
|
||||
|
||||
const result = await executeInApp<{ id?: string; title?: string; state?: string; created_at?: number; error?: string }>(
|
||||
{
|
||||
type: 'createMeeting',
|
||||
title,
|
||||
}
|
||||
);
|
||||
const result = await executeInApp<{
|
||||
id?: string;
|
||||
title?: string;
|
||||
state?: string;
|
||||
created_at?: number;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'createMeeting',
|
||||
title,
|
||||
});
|
||||
|
||||
if (!result?.error && result?.id) {
|
||||
expect(result).toHaveProperty('id');
|
||||
@@ -103,7 +111,11 @@ describe('Meeting Operations', () => {
|
||||
const title = TestData.createMeetingTitle();
|
||||
const metadata = { test_key: 'test_value', source: 'e2e-native' };
|
||||
|
||||
const result = await executeInApp<{ id?: string; metadata?: Record<string, unknown>; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
id?: string;
|
||||
metadata?: Record<string, unknown>;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'createMeeting',
|
||||
title,
|
||||
metadata,
|
||||
|
||||
@@ -43,7 +43,6 @@ describe('Audio Devices', () => {
|
||||
return;
|
||||
}
|
||||
expect(result.success).toBe(true);
|
||||
|
||||
});
|
||||
});
|
||||
|
||||
@@ -105,10 +104,12 @@ describe('Recording Operations', () => {
|
||||
testMeetingId = meeting.id;
|
||||
|
||||
// Start transcription
|
||||
const result = await executeInApp<{ success?: boolean; hasStream?: boolean; error?: string }>({
|
||||
type: 'startTranscription',
|
||||
meetingId: meeting.id,
|
||||
});
|
||||
const result = await executeInApp<{ success?: boolean; hasStream?: boolean; error?: string }>(
|
||||
{
|
||||
type: 'startTranscription',
|
||||
meetingId: meeting.id,
|
||||
}
|
||||
);
|
||||
|
||||
// May fail if no audio device available
|
||||
expect(result).toBeDefined();
|
||||
@@ -129,7 +130,11 @@ describe('Playback Operations', () => {
|
||||
|
||||
describe('getPlaybackState', () => {
|
||||
it('should return playback state', async () => {
|
||||
const result = await executeInApp<{ success?: boolean; state?: Record<string, unknown>; error?: string }>({
|
||||
const result = await executeInApp<{
|
||||
success?: boolean;
|
||||
state?: Record<string, unknown>;
|
||||
error?: string;
|
||||
}>({
|
||||
type: 'getPlaybackState',
|
||||
});
|
||||
|
||||
@@ -153,7 +158,9 @@ describe('Playback Operations', () => {
|
||||
return;
|
||||
}
|
||||
|
||||
const result = await executeInApp<{ success?: boolean; error?: string }>({ type: 'pausePlayback' });
|
||||
const result = await executeInApp<{ success?: boolean; error?: string }>({
|
||||
type: 'pausePlayback',
|
||||
});
|
||||
expect(result).toBeDefined();
|
||||
});
|
||||
|
||||
@@ -169,7 +176,9 @@ describe('Playback Operations', () => {
|
||||
return;
|
||||
}
|
||||
|
||||
const result = await executeInApp<{ success?: boolean; error?: string }>({ type: 'stopPlayback' });
|
||||
const result = await executeInApp<{ success?: boolean; error?: string }>({
|
||||
type: 'stopPlayback',
|
||||
});
|
||||
expect(result).toBeDefined();
|
||||
});
|
||||
});
|
||||
|
||||
@@ -101,13 +101,7 @@ describe('Round-trip flow', () => {
|
||||
throw new Error('Meeting ID missing');
|
||||
}
|
||||
|
||||
const wavPath = path.resolve(
|
||||
process.cwd(),
|
||||
'..',
|
||||
'tests',
|
||||
'fixtures',
|
||||
'sample_discord.wav'
|
||||
);
|
||||
const wavPath = path.resolve(process.cwd(), '..', 'tests', 'fixtures', 'sample_discord.wav');
|
||||
|
||||
const startResult = await executeInApp({
|
||||
type: 'startTranscriptionWithInjection',
|
||||
@@ -121,9 +115,10 @@ describe('Round-trip flow', () => {
|
||||
throw new Error(`Recording/injection failed: ${startResult.error ?? 'unknown error'}`);
|
||||
}
|
||||
|
||||
const injectResult = startResult.inject as
|
||||
| { chunksSent?: number; durationSeconds?: number }
|
||||
| null;
|
||||
const injectResult = startResult.inject as {
|
||||
chunksSent?: number;
|
||||
durationSeconds?: number;
|
||||
} | null;
|
||||
if (!injectResult || (injectResult.chunksSent ?? 0) <= 0) {
|
||||
console.log('[e2e] injection_debug', startResult.debug ?? null);
|
||||
throw new Error('Audio injection did not send any chunks');
|
||||
|
||||
@@ -128,17 +128,14 @@ export async function waitForNetworkIdle(page: Page, timeout = 5000): Promise<vo
|
||||
*/
|
||||
export async function executeWithAPI<T>(page: Page, fn: (api: unknown) => Promise<T>): Promise<T> {
|
||||
await waitForAPI(page, E2E_TIMEOUTS.PAGE_LOAD_MS);
|
||||
return page.evaluate(
|
||||
async (fnInPage) => {
|
||||
// Access API through window.__NOTEFLOW_API__ which we expose for testing
|
||||
const api = window.__NOTEFLOW_API__;
|
||||
if (!api) {
|
||||
throw new Error('API not exposed on window. Ensure test mode is enabled.');
|
||||
}
|
||||
return fnInPage(api);
|
||||
},
|
||||
fn
|
||||
);
|
||||
return page.evaluate(async (fnInPage) => {
|
||||
// Access API through window.__NOTEFLOW_API__ which we expose for testing
|
||||
const api = window.__NOTEFLOW_API__;
|
||||
if (!api) {
|
||||
throw new Error('API not exposed on window. Ensure test mode is enabled.');
|
||||
}
|
||||
return fnInPage(api);
|
||||
}, fn);
|
||||
}
|
||||
|
||||
/**
|
||||
|
||||
@@ -202,11 +202,7 @@ test.describe('oidc provider api integration', () => {
|
||||
const created = await callAPI<OidcProvider>(page, 'registerOidcProvider', testData);
|
||||
|
||||
// Test connection (may fail for mock issuer, but API call should succeed)
|
||||
const result = await callAPI<RefreshDiscoveryResult>(
|
||||
page,
|
||||
'testOidcConnection',
|
||||
created.id
|
||||
);
|
||||
const result = await callAPI<RefreshDiscoveryResult>(page, 'testOidcConnection', created.id);
|
||||
|
||||
expect(result).toHaveProperty('results');
|
||||
expect(result).toHaveProperty('success_count');
|
||||
|
||||
@@ -21,7 +21,9 @@ import {
|
||||
|
||||
const shouldRun = process.env.NOTEFLOW_E2E === '1';
|
||||
const meetingDetailPath = (meeting: { id: string; project_id?: string }) =>
|
||||
meeting.project_id ? `/projects/${meeting.project_id}/meetings/${meeting.id}` : `/meetings/${meeting.id}`;
|
||||
meeting.project_id
|
||||
? `/projects/${meeting.project_id}/meetings/${meeting.id}`
|
||||
: `/meetings/${meeting.id}`;
|
||||
|
||||
test.describe('post-processing pipeline', () => {
|
||||
test.skip(!shouldRun, 'Set NOTEFLOW_E2E=1 to enable end-to-end tests.');
|
||||
@@ -81,7 +83,7 @@ test.describe('post-processing pipeline', () => {
|
||||
page,
|
||||
'createMeeting',
|
||||
{
|
||||
title: 'E2E Meeting Detail Test',
|
||||
title: 'E2E Meeting Detail Test',
|
||||
}
|
||||
);
|
||||
|
||||
@@ -90,7 +92,9 @@ test.describe('post-processing pipeline', () => {
|
||||
await waitForLoadingComplete(page);
|
||||
|
||||
// Verify meeting title is displayed
|
||||
const titleElement = page.locator(`h1:has-text("E2E Meeting Detail Test"), h2:has-text("E2E Meeting Detail Test"), [data-testid="meeting-title"]`);
|
||||
const titleElement = page.locator(
|
||||
`h1:has-text("E2E Meeting Detail Test"), h2:has-text("E2E Meeting Detail Test"), [data-testid="meeting-title"]`
|
||||
);
|
||||
await expect(titleElement.first()).toBeVisible({ timeout: 10000 });
|
||||
});
|
||||
|
||||
@@ -246,7 +250,10 @@ test.describe('post-processing pipeline', () => {
|
||||
|
||||
// Click on the meeting card or link
|
||||
const meetingLink = page.locator(`a[href*="${meeting.id}"], [data-testid="meeting-card"]`);
|
||||
const isClickable = await meetingLink.first().isVisible().catch(() => false);
|
||||
const isClickable = await meetingLink
|
||||
.first()
|
||||
.isVisible()
|
||||
.catch(() => false);
|
||||
|
||||
if (isClickable) {
|
||||
await meetingLink.first().click();
|
||||
|
||||
@@ -6,13 +6,7 @@
|
||||
*/
|
||||
|
||||
import { expect, test } from '@playwright/test';
|
||||
import {
|
||||
callAPI,
|
||||
E2E_TIMEOUTS,
|
||||
navigateTo,
|
||||
waitForAPI,
|
||||
waitForLoadingComplete,
|
||||
} from './fixtures';
|
||||
import { callAPI, E2E_TIMEOUTS, navigateTo, waitForAPI, waitForLoadingComplete } from './fixtures';
|
||||
|
||||
const shouldRun = process.env.NOTEFLOW_E2E === '1';
|
||||
|
||||
@@ -133,7 +127,10 @@ test.describe('Audio Devices Section', () => {
|
||||
const detectBtn = audioCard.locator(
|
||||
'button:has-text("Detect"), button:has-text("Grant"), button:has-text("Refresh")'
|
||||
);
|
||||
const detectVisible = await detectBtn.first().isVisible().catch(() => false);
|
||||
const detectVisible = await detectBtn
|
||||
.first()
|
||||
.isVisible()
|
||||
.catch(() => false);
|
||||
expect(detectVisible).toBe(true);
|
||||
});
|
||||
|
||||
|
||||
@@ -172,6 +172,10 @@ impl DriftDetector {
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
const BASE_BUFFER_LEN: usize = 1000;
|
||||
const EXTREME_BUFFER_LEN: usize = 10_000;
|
||||
const EXTREME_DRIFT_MULTIPLIER: usize = 10;
|
||||
|
||||
#[test]
|
||||
fn test_drift_detector_new() {
|
||||
let detector = DriftDetector::new();
|
||||
@@ -186,8 +190,8 @@ mod tests {
|
||||
// Simulate primary buffer growing faster than secondary
|
||||
// This indicates primary source is producing more samples (faster clock)
|
||||
for i in 0..200 {
|
||||
let primary_len = 1000 + i * 2; // Growing faster
|
||||
let secondary_len = 1000 + i; // Growing slower
|
||||
let primary_len = BASE_BUFFER_LEN + i * 2; // Growing faster
|
||||
let secondary_len = BASE_BUFFER_LEN + i; // Growing slower
|
||||
detector.update(primary_len, secondary_len);
|
||||
}
|
||||
|
||||
@@ -205,8 +209,8 @@ mod tests {
|
||||
|
||||
// Simulate secondary buffer growing faster than primary
|
||||
for i in 0..200 {
|
||||
let primary_len = 1000 + i;
|
||||
let secondary_len = 1000 + i * 2;
|
||||
let primary_len = BASE_BUFFER_LEN + i;
|
||||
let secondary_len = BASE_BUFFER_LEN + i * 2;
|
||||
detector.update(primary_len, secondary_len);
|
||||
}
|
||||
|
||||
@@ -222,7 +226,7 @@ mod tests {
|
||||
|
||||
// Both buffers at same level
|
||||
for _ in 0..100 {
|
||||
detector.update(1000, 1000);
|
||||
detector.update(BASE_BUFFER_LEN, BASE_BUFFER_LEN);
|
||||
}
|
||||
|
||||
assert!(
|
||||
@@ -237,7 +241,7 @@ mod tests {
|
||||
|
||||
// Build up some state
|
||||
for i in 0..50 {
|
||||
detector.update(1000 + i, 1000);
|
||||
detector.update(BASE_BUFFER_LEN + i, BASE_BUFFER_LEN);
|
||||
}
|
||||
|
||||
detector.reset();
|
||||
@@ -254,8 +258,8 @@ mod tests {
|
||||
let mut ratio_received = false;
|
||||
for i in 0..10000 {
|
||||
// Extreme drift to trigger ratio calculation
|
||||
let primary_len = 10000 + i * 10;
|
||||
let secondary_len = 10000;
|
||||
let primary_len = EXTREME_BUFFER_LEN + i * EXTREME_DRIFT_MULTIPLIER;
|
||||
let secondary_len = EXTREME_BUFFER_LEN;
|
||||
if let Some(ratio) = detector.update(primary_len, secondary_len) {
|
||||
ratio_received = true;
|
||||
// Ratio should be close to 1.0 but not exactly 1.0
|
||||
|
||||
@@ -186,6 +186,13 @@ impl DriftMetrics {
mod tests {
use super::*;

const TEST_DRIFT_PPM: f64 = 50.0;
const TEST_RATIO: f64 = 1.00005;
const TEST_DRIFT_PPM_ALT: f64 = 25.0;
const TEST_RATIO_ALT: f64 = 1.000025;
const TEST_DRIFT_PPM_UPDATE: f64 = 30.0;
const TEST_RATIO_UPDATE: f64 = 1.00003;

#[test]
fn test_drift_metrics_new() {
let metrics = DriftMetrics::new();
@@ -205,22 +212,22 @@ mod tests {
#[test]
fn test_record_adjustment() {
let mut metrics = DriftMetrics::new();
metrics.record_adjustment(50.0, 1.00005);
metrics.record_adjustment(TEST_DRIFT_PPM, TEST_RATIO);
assert_eq!(metrics.adjustment_count(), 1);
assert_eq!(metrics.last_drift_ppm(), 50.0);
assert!((metrics.last_ratio() - 1.00005).abs() < f64::EPSILON);
assert_eq!(metrics.last_drift_ppm(), TEST_DRIFT_PPM);
assert!((metrics.last_ratio() - TEST_RATIO).abs() < f64::EPSILON);
}

#[test]
fn test_snapshot() {
let mut metrics = DriftMetrics::new();
metrics.record_overflow();
metrics.record_adjustment(25.0, 1.000025);
metrics.record_adjustment(TEST_DRIFT_PPM_ALT, TEST_RATIO_ALT);

let snapshot = metrics.snapshot();
assert_eq!(snapshot.overflow_count, 1);
assert_eq!(snapshot.adjustment_count, 1);
assert_eq!(snapshot.drift_ppm, 25.0);
assert_eq!(snapshot.drift_ppm, TEST_DRIFT_PPM_ALT);
assert!(snapshot.enabled);
}

@@ -240,7 +247,7 @@ mod tests {
fn test_reset() {
let mut metrics = DriftMetrics::new();
metrics.record_overflow();
metrics.record_adjustment(50.0, 1.00005);
metrics.record_adjustment(TEST_DRIFT_PPM, TEST_RATIO);
metrics.set_enabled(false);

metrics.reset();
@@ -263,10 +270,10 @@ mod tests {
#[test]
fn test_update_values() {
let mut metrics = DriftMetrics::new();
metrics.update_values(30.0, 1.00003);
metrics.update_values(TEST_DRIFT_PPM_UPDATE, TEST_RATIO_UPDATE);

assert_eq!(metrics.last_drift_ppm(), 30.0);
assert!((metrics.last_ratio() - 1.00003).abs() < f64::EPSILON);
assert_eq!(metrics.last_drift_ppm(), TEST_DRIFT_PPM_UPDATE);
assert!((metrics.last_ratio() - TEST_RATIO_UPDATE).abs() < f64::EPSILON);
// Should not increment adjustment count
assert_eq!(metrics.adjustment_count(), 0);
}

@@ -6,6 +6,10 @@
use crate::constants::drift_compensation::{RATIO_BYPASS_THRESHOLD, RATIO_SLEW_RATE};
use rubato::{Resampler, SincFixedIn, SincInterpolationParameters, SincInterpolationType, WindowFunction};

const DEFAULT_CHUNK_SIZE: usize = 1024;
const RESAMPLER_SINC_LEN: usize = 256;
const RESAMPLER_OVERSAMPLING_FACTOR: usize = 256;

/// Adaptive resampler for drift compensation.
///
/// Uses rubato's sinc interpolation for high-quality resampling with
@@ -45,7 +49,7 @@ impl AdaptiveResampler {
/// * `sample_rate` - Audio sample rate in Hz
/// * `channels` - Number of audio channels (1 = mono, 2 = stereo)
pub fn new(sample_rate: u32, channels: usize) -> Self {
let chunk_size = 1024; // Default chunk size for processing
let chunk_size = DEFAULT_CHUNK_SIZE; // Default chunk size for processing

let mut resampler = Self {
resampler: None,
@@ -111,7 +115,10 @@ impl AdaptiveResampler {
let deinterleaved = Self::deinterleave_static(samples, self.channels);

// Now get mutable reference to resampler
let resampler = self.resampler.as_mut().unwrap();
let resampler = match self.resampler.as_mut() {
Some(resampler) => resampler,
None => return samples.to_vec(),
};

// Process through rubato - returns Vec<Vec<f32>>
match resampler.process(&deinterleaved, None) {
@@ -167,10 +174,10 @@ impl AdaptiveResampler {
}

let params = SincInterpolationParameters {
sinc_len: 256,
sinc_len: RESAMPLER_SINC_LEN,
f_cutoff: 0.95,
interpolation: SincInterpolationType::Linear,
oversampling_factor: 256,
oversampling_factor: RESAMPLER_OVERSAMPLING_FACTOR,
window: WindowFunction::BlackmanHarris2,
};

@@ -270,6 +277,8 @@ mod tests {

const TEST_SAMPLE_RATE: u32 = 48000;
const TEST_CHANNELS: usize = 2;
const TEST_TARGET_RATIO: f64 = 1.001;
const TEST_INPUT_SAMPLES: usize = 2048;

#[test]
fn test_adaptive_resampler_new() {
@@ -294,8 +303,8 @@ mod tests {
fn test_adaptive_resampler_set_ratio() {
let mut resampler = AdaptiveResampler::new(TEST_SAMPLE_RATE, TEST_CHANNELS);

resampler.set_target_ratio(1.001);
assert_eq!(resampler.target_ratio(), 1.001);
resampler.set_target_ratio(TEST_TARGET_RATIO);
assert_eq!(resampler.target_ratio(), TEST_TARGET_RATIO);

// Current ratio should still be 1.0 until process is called
assert_eq!(resampler.current_ratio(), 1.0);
@@ -305,10 +314,10 @@ mod tests {
fn test_adaptive_resampler_slew_limiting() {
let mut resampler = AdaptiveResampler::new(TEST_SAMPLE_RATE, TEST_CHANNELS);

resampler.set_target_ratio(1.001);
resampler.set_target_ratio(TEST_TARGET_RATIO);

// Process some samples to trigger ratio update
let input: Vec<f32> = vec![0.0; 2048];
let input: Vec<f32> = vec![0.0; TEST_INPUT_SAMPLES];
resampler.process(&input);

// Ratio should have moved towards target but not jumped there instantly
@@ -321,7 +330,7 @@ mod tests {
fn test_adaptive_resampler_reset() {
let mut resampler = AdaptiveResampler::new(TEST_SAMPLE_RATE, TEST_CHANNELS);

resampler.set_target_ratio(1.001);
resampler.set_target_ratio(TEST_TARGET_RATIO);
resampler.reset();

assert_eq!(resampler.current_ratio(), 1.0);

@@ -309,6 +309,10 @@ mod tests {
const TEST_SAMPLE_RATE: u32 = 48_000;
const TEST_CHANNELS: u16 = 1;
const EPSILON: f32 = 0.001;
const TEST_BUFFER_FILL_SAMPLES: usize = 1000;
const TEST_DRAIN_SAMPLES: usize = 500;
const TEST_PRIMARY_SAMPLE: f32 = 0.5;
const TEST_SECONDARY_SAMPLE: f32 = 0.3;

#[test]
fn test_mixer_basic() {
@@ -407,11 +411,11 @@ mod tests {
let mixer = AudioMixer::new(TEST_SAMPLE_RATE, TEST_CHANNELS, 1.0, 1.0);

// Push some data
mixer.push_primary(&[0.5; 1000]);
mixer.push_secondary(&[0.3; 1000]);
mixer.push_primary(&[TEST_PRIMARY_SAMPLE; TEST_BUFFER_FILL_SAMPLES]);
mixer.push_secondary(&[TEST_SECONDARY_SAMPLE; TEST_BUFFER_FILL_SAMPLES]);

// Drain to trigger drift detection
mixer.drain_mixed(500);
mixer.drain_mixed(TEST_DRAIN_SAMPLES);

// Clear should reset everything
mixer.clear();

@@ -74,7 +74,10 @@ const App = () => (
<Route element={<AppLayout />}>
<Route path="/" element={<HomePage />} />
<Route path="/projects" element={<ProjectsPage />} />
<Route path="/projects/:projectId/settings" element={<ProjectSettingsPage />} />
<Route
path="/projects/:projectId/settings"
element={<ProjectSettingsPage />}
/>
<Route path="/projects/:projectId/meetings" element={<MeetingsPage />} />
<Route
path="/projects/:projectId/meetings/:id"

@@ -2,10 +2,7 @@ import type { NoteFlowAPI } from '../interface';
import type { ListInstalledAppsRequest, ListInstalledAppsResponse } from '../types';
import { rejectReadOnly } from './readonly';

export const cachedAppsAPI: Pick<
NoteFlowAPI,
'listInstalledApps' | 'invalidateAppCache'
> = {
export const cachedAppsAPI: Pick<NoteFlowAPI, 'listInstalledApps' | 'invalidateAppCache'> = {
async listInstalledApps(_options?: ListInstalledAppsRequest): Promise<ListInstalledAppsResponse> {
return {
apps: [],

@@ -1,8 +1,5 @@
import type { NoteFlowAPI } from '../interface';
import type {
StreamingConfiguration,
UpdateStreamingConfigurationRequest,
} from '../types';
import type { StreamingConfiguration, UpdateStreamingConfigurationRequest } from '../types';
import { rejectReadOnly } from './readonly';

const offlineStreamingConfiguration: StreamingConfiguration = {

@@ -64,7 +64,10 @@ function extractGrpcStatusCodeFromMessage(message: string): number | undefined {
if (!match?.[1]) {
return undefined;
}
const normalized = match[1].replace(/([a-z0-9])([A-Z])/g, '$1_$2').replace(/__/g, '_').toUpperCase();
const normalized = match[1]
.replace(/([a-z0-9])([A-Z])/g, '$1_$2')
.replace(/__/g, '_')
.toUpperCase();
const code = GRPC_STATUS_CODES[normalized as keyof typeof GRPC_STATUS_CODES];
return typeof code === 'number' ? code : undefined;
}

@@ -164,7 +164,10 @@ if (typeof window !== 'undefined') {
const current = preferences.get();
preferences.replace({ ...current, ...updates });
},
forceConnectionState: (mode: 'connected' | 'disconnected' | 'cached' | 'mock', serverUrl?: string | null) => {
forceConnectionState: (
mode: 'connected' | 'disconnected' | 'cached' | 'mock',
serverUrl?: string | null
) => {
setConnectionMode(mode);
setConnectionServerUrl(serverUrl ?? null);
},

@@ -443,9 +443,7 @@ export interface NoteFlowAPI {
* Set a HuggingFace token with optional validation
* @see gRPC endpoint: SetHuggingFaceToken (unary)
*/
setHuggingFaceToken(
request: SetHuggingFaceTokenRequest
): Promise<SetHuggingFaceTokenResult>;
setHuggingFaceToken(request: SetHuggingFaceTokenRequest): Promise<SetHuggingFaceTokenResult>;

/**
* Get the status of the configured HuggingFace token

@@ -1737,8 +1737,7 @@ export const mockAPI: NoteFlowAPI = {
name: request.name ?? provider.name,
scopes: requestedScopes.length > 0 ? requestedScopes : provider.scopes,
claim_mapping: request.claim_mapping ?? provider.claim_mapping,
allowed_groups:
requestedGroups.length > 0 ? requestedGroups : provider.allowed_groups,
allowed_groups: requestedGroups.length > 0 ? requestedGroups : provider.allowed_groups,
require_email_verified: request.require_email_verified ?? provider.require_email_verified,
enabled: request.enabled ?? provider.enabled,
updated_at: Date.now(),

@@ -309,24 +309,21 @@ export class TauriTranscriptionStream implements TranscriptionStream {
|
||||
this.unlistenFn = null;
|
||||
}
|
||||
|
||||
const unlisten = await this.listen<TranscriptUpdate>(
|
||||
TauriEvents.TRANSCRIPT_UPDATE,
|
||||
(event) => {
|
||||
if (this.isClosed) {
|
||||
return;
|
||||
}
|
||||
if (event.payload.meeting_id === this.meetingId) {
|
||||
// Track latest ack_sequence for monitoring
|
||||
if (
|
||||
typeof event.payload.ack_sequence === 'number' &&
|
||||
event.payload.ack_sequence > this.lastAckedSequence
|
||||
) {
|
||||
this.lastAckedSequence = event.payload.ack_sequence;
|
||||
}
|
||||
callback(event.payload);
|
||||
}
|
||||
const unlisten = await this.listen<TranscriptUpdate>(TauriEvents.TRANSCRIPT_UPDATE, (event) => {
|
||||
if (this.isClosed) {
|
||||
return;
|
||||
}
|
||||
);
|
||||
if (event.payload.meeting_id === this.meetingId) {
|
||||
// Track latest ack_sequence for monitoring
|
||||
if (
|
||||
typeof event.payload.ack_sequence === 'number' &&
|
||||
event.payload.ack_sequence > this.lastAckedSequence
|
||||
) {
|
||||
this.lastAckedSequence = event.payload.ack_sequence;
|
||||
}
|
||||
callback(event.payload);
|
||||
}
|
||||
});
|
||||
if (this.isClosed) {
|
||||
unlisten();
|
||||
return;
|
||||
@@ -788,13 +785,16 @@ export function createTauriAPI(invoke: TauriInvoke, listen: TauriListen): NoteFl
|
||||
async listSummarizationTemplates(
|
||||
request: ListSummarizationTemplatesRequest
|
||||
): Promise<ListSummarizationTemplatesResponse> {
|
||||
return invoke<ListSummarizationTemplatesResponse>(TauriCommands.LIST_SUMMARIZATION_TEMPLATES, {
|
||||
workspace_id: request.workspace_id,
|
||||
include_system: request.include_system ?? true,
|
||||
include_archived: request.include_archived ?? false,
|
||||
limit: request.limit,
|
||||
offset: request.offset,
|
||||
});
|
||||
return invoke<ListSummarizationTemplatesResponse>(
|
||||
TauriCommands.LIST_SUMMARIZATION_TEMPLATES,
|
||||
{
|
||||
workspace_id: request.workspace_id,
|
||||
include_system: request.include_system ?? true,
|
||||
include_archived: request.include_archived ?? false,
|
||||
limit: request.limit,
|
||||
offset: request.offset,
|
||||
}
|
||||
);
|
||||
},
|
||||
|
||||
async getSummarizationTemplate(
|
||||
@@ -992,11 +992,7 @@ export function createTauriAPI(invoke: TauriInvoke, listen: TauriListen): NoteFl
|
||||
clientLog.exportCompleted(meetingId, format);
|
||||
return result;
|
||||
} catch (error) {
|
||||
clientLog.exportFailed(
|
||||
meetingId,
|
||||
format,
|
||||
extractErrorMessage(error, 'Export failed')
|
||||
);
|
||||
clientLog.exportFailed(meetingId, format, extractErrorMessage(error, 'Export failed'));
|
||||
throw error;
|
||||
}
|
||||
},
|
||||
@@ -1108,7 +1104,10 @@ export function createTauriAPI(invoke: TauriInvoke, listen: TauriListen): NoteFl
|
||||
await invoke(TauriCommands.SET_DUAL_CAPTURE_ENABLED, { enabled });
|
||||
},
|
||||
async setAudioMixLevels(micGain: number, systemGain: number): Promise<void> {
|
||||
await invoke(TauriCommands.SET_AUDIO_MIX_LEVELS, { mic_gain: micGain, system_gain: systemGain });
|
||||
await invoke(TauriCommands.SET_AUDIO_MIX_LEVELS, {
|
||||
mic_gain: micGain,
|
||||
system_gain: systemGain,
|
||||
});
|
||||
},
|
||||
async getDualCaptureConfig(): Promise<DualCaptureConfigInfo> {
|
||||
return invoke<DualCaptureConfigInfo>(TauriCommands.GET_DUAL_CAPTURE_CONFIG);
|
||||
@@ -1172,7 +1171,9 @@ export function createTauriAPI(invoke: TauriInvoke, listen: TauriListen): NoteFl
|
||||
};
|
||||
},
|
||||
|
||||
async listInstalledApps(options?: ListInstalledAppsRequest): Promise<ListInstalledAppsResponse> {
|
||||
async listInstalledApps(
|
||||
options?: ListInstalledAppsRequest
|
||||
): Promise<ListInstalledAppsResponse> {
|
||||
return invoke<ListInstalledAppsResponse>(TauriCommands.LIST_INSTALLED_APPS, {
|
||||
common_only: options?.commonOnly ?? false,
|
||||
page: options?.page ?? 0,
|
||||
@@ -1278,13 +1279,17 @@ export function createTauriAPI(invoke: TauriInvoke, listen: TauriListen): NoteFl
|
||||
});
|
||||
},
|
||||
async disconnectCalendar(provider: string): Promise<DisconnectOAuthResponse> {
|
||||
const response = await invoke<DisconnectOAuthResponse>(TauriCommands.DISCONNECT_OAUTH, { provider });
|
||||
const response = await invoke<DisconnectOAuthResponse>(TauriCommands.DISCONNECT_OAUTH, {
|
||||
provider,
|
||||
});
|
||||
clientLog.calendarDisconnected(provider);
|
||||
return response;
|
||||
},
|
||||
|
||||
async registerWebhook(r: RegisterWebhookRequest): Promise<RegisteredWebhook> {
|
||||
const webhook = await invoke<RegisteredWebhook>(TauriCommands.REGISTER_WEBHOOK, { request: r });
|
||||
const webhook = await invoke<RegisteredWebhook>(TauriCommands.REGISTER_WEBHOOK, {
|
||||
request: r,
|
||||
});
|
||||
clientLog.webhookRegistered(webhook.id, webhook.name);
|
||||
return webhook;
|
||||
},
|
||||
@@ -1297,7 +1302,9 @@ export function createTauriAPI(invoke: TauriInvoke, listen: TauriListen): NoteFl
|
||||
return invoke<RegisteredWebhook>(TauriCommands.UPDATE_WEBHOOK, { request: r });
|
||||
},
|
||||
async deleteWebhook(webhookId: string): Promise<DeleteWebhookResponse> {
|
||||
const response = await invoke<DeleteWebhookResponse>(TauriCommands.DELETE_WEBHOOK, { webhook_id: webhookId });
|
||||
const response = await invoke<DeleteWebhookResponse>(TauriCommands.DELETE_WEBHOOK, {
|
||||
webhook_id: webhookId,
|
||||
});
|
||||
clientLog.webhookDeleted(webhookId);
|
||||
return response;
|
||||
},
|
||||
@@ -1420,11 +1427,7 @@ export function isTauriEnvironment(): boolean {
|
||||
}
|
||||
// Tauri 2.x injects __TAURI_INTERNALS__ into the window
|
||||
// Only check for Tauri-injected globals, not our own globals like __NOTEFLOW_API__
|
||||
return (
|
||||
'__TAURI_INTERNALS__' in window ||
|
||||
'__TAURI__' in window ||
|
||||
'isTauri' in window
|
||||
);
|
||||
return '__TAURI_INTERNALS__' in window || '__TAURI__' in window || 'isTauri' in window;
|
||||
}
|
||||
|
||||
/** Dynamically import Tauri APIs and create the adapter. */
|
||||
|
||||
@@ -104,9 +104,10 @@ describe('TauriTranscriptionStream', () => {
|
||||
expect(errorCallback).toHaveBeenCalledTimes(1);
|
||||
});
|
||||
|
||||
// The StreamingQueue reports failures with its own message format
|
||||
const expectedError: Record<string, unknown> = {
|
||||
code: 'stream_send_failed',
|
||||
message: expect.stringContaining('Connection lost'),
|
||||
message: expect.stringContaining('consecutive failures'),
|
||||
};
|
||||
expect(errorCallback).toHaveBeenCalledWith(expectedError);
|
||||
});
|
||||
@@ -151,12 +152,12 @@ describe('TauriTranscriptionStream', () => {
|
||||
timestamp: 1,
|
||||
});
|
||||
|
||||
// StreamingQueue logs errors via the async-queue module
|
||||
await vi.waitFor(() => {
|
||||
expect(mockAddClientLog).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
level: 'error',
|
||||
source: 'api',
|
||||
message: 'Tauri stream send_audio_chunk failed',
|
||||
message: expect.stringContaining('operation failed'),
|
||||
})
|
||||
);
|
||||
});
|
||||
@@ -180,15 +181,16 @@ describe('TauriTranscriptionStream', () => {
|
||||
const failingStream = new TauriTranscriptionStream('meeting-123', failingInvoke, mockListen);
|
||||
failingStream.onError(errorCallback);
|
||||
|
||||
failingStream.close();
|
||||
|
||||
await vi.waitFor(() => {
|
||||
const expectedError: Record<string, unknown> = {
|
||||
code: 'stream_close_failed',
|
||||
message: expect.stringContaining('Failed to stop'),
|
||||
};
|
||||
expect(errorCallback).toHaveBeenCalledWith(expectedError);
|
||||
// close() re-throws errors, so we need to catch it
|
||||
await failingStream.close().catch(() => {
|
||||
// Expected to throw
|
||||
});
|
||||
|
||||
const expectedError: Record<string, unknown> = {
|
||||
code: 'stream_close_failed',
|
||||
message: expect.stringContaining('Failed to stop'),
|
||||
};
|
||||
expect(errorCallback).toHaveBeenCalledWith(expectedError);
|
||||
});
|
||||
|
||||
it('logs close errors to clientLog', async () => {
|
||||
@@ -197,17 +199,18 @@ describe('TauriTranscriptionStream', () => {
|
||||
const failingInvoke = vi.fn().mockRejectedValue(new Error('Stop failed'));
|
||||
const failingStream = new TauriTranscriptionStream('meeting-123', failingInvoke, mockListen);
|
||||
|
||||
failingStream.close();
|
||||
|
||||
await vi.waitFor(() => {
|
||||
expect(mockAddClientLog).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
level: 'error',
|
||||
source: 'api',
|
||||
message: 'Tauri stream stop_recording failed',
|
||||
})
|
||||
);
|
||||
// close() re-throws errors, so we need to catch it
|
||||
await failingStream.close().catch(() => {
|
||||
// Expected to throw
|
||||
});
|
||||
|
||||
expect(mockAddClientLog).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
level: 'error',
|
||||
source: 'api',
|
||||
message: 'Tauri stream stop_recording failed',
|
||||
})
|
||||
);
|
||||
});
|
||||
});
|
||||
|
||||
|
||||
@@ -29,13 +29,7 @@ export type ASRComputeType = 'unspecified' | 'int8' | 'float16' | 'float32';
|
||||
/**
|
||||
* Job status for background tasks
|
||||
*/
|
||||
export type JobStatus =
|
||||
| 'unspecified'
|
||||
| 'queued'
|
||||
| 'running'
|
||||
| 'completed'
|
||||
| 'failed'
|
||||
| 'cancelled';
|
||||
export type JobStatus = 'unspecified' | 'queued' | 'running' | 'completed' | 'failed' | 'cancelled';
|
||||
|
||||
/**
|
||||
* Current ASR configuration and capabilities
|
||||
@@ -97,12 +91,7 @@ export interface UpdateASRConfigurationResult {
|
||||
/**
|
||||
* ASR reconfiguration job phase
|
||||
*/
|
||||
export type ASRJobPhase =
|
||||
| 'validating'
|
||||
| 'downloading'
|
||||
| 'loading'
|
||||
| 'completed'
|
||||
| 'failed';
|
||||
export type ASRJobPhase = 'validating' | 'downloading' | 'loading' | 'completed' | 'failed';
|
||||
|
||||
/**
|
||||
* Status of an ASR reconfiguration job
|
||||
|
||||
@@ -10,6 +10,7 @@ import { Button } from '@/components/ui/button';
|
||||
import { Collapsible, CollapsibleContent, CollapsibleTrigger } from '@/components/ui/collapsible';
|
||||
import { formatRelativeTimeMs } from '@/lib/format';
|
||||
import { toFriendlyMessage } from '@/lib/log-messages';
|
||||
import { sanitizeLogMetadata } from '@/lib/log-sanitizer';
|
||||
import type { SummarizedLog } from '@/lib/log-summarizer';
|
||||
import { cn } from '@/lib/utils';
|
||||
import { levelConfig } from './log-entry-config';
|
||||
@@ -46,22 +47,24 @@ export interface LogEntryProps {
|
||||
}
|
||||
|
||||
export function LogEntry({ summarized, viewMode, isExpanded, onToggleExpanded }: LogEntryProps) {
|
||||
const {log} = summarized;
|
||||
const { log } = summarized;
|
||||
const sanitizedMetadata = log.metadata ? sanitizeLogMetadata(log.metadata) : undefined;
|
||||
const safeLog = sanitizedMetadata ? { ...log, metadata: sanitizedMetadata } : log;
|
||||
const config = levelConfig[log.level];
|
||||
const Icon = config.icon;
|
||||
const hasDetails = log.details || log.metadata || log.traceId || log.spanId;
|
||||
const hasDetails = safeLog.details || safeLog.metadata || safeLog.traceId || safeLog.spanId;
|
||||
|
||||
// Get display message based on view mode
|
||||
const displayMessage =
|
||||
viewMode === 'friendly'
|
||||
? toFriendlyMessage(log.message, (log.metadata as Record<string, string>) ?? {})
|
||||
: log.message;
|
||||
? toFriendlyMessage(safeLog.message, (safeLog.metadata as Record<string, string>) ?? {})
|
||||
: safeLog.message;
|
||||
|
||||
// Get display timestamp based on view mode
|
||||
const displayTimestamp =
|
||||
viewMode === 'friendly'
|
||||
? formatRelativeTimeMs(log.timestamp)
|
||||
: format(new Date(log.timestamp), 'HH:mm:ss.SSS');
|
||||
? formatRelativeTimeMs(safeLog.timestamp)
|
||||
: format(new Date(safeLog.timestamp), 'HH:mm:ss.SSS');
|
||||
|
||||
return (
|
||||
<Collapsible open={isExpanded} onOpenChange={onToggleExpanded}>
|
||||
@@ -91,9 +94,7 @@ export function LogEntry({ summarized, viewMode, isExpanded, onToggleExpanded }:
|
||||
<Badge variant="outline" className={cn('text-xs', sourceColors[log.source])}>
|
||||
{log.source}
|
||||
</Badge>
|
||||
<Badge variant="secondary">
|
||||
{log.origin}
|
||||
</Badge>
|
||||
<Badge variant="secondary">{log.origin}</Badge>
|
||||
</>
|
||||
)}
|
||||
{summarized.isGroup && summarized.count > 1 && (
|
||||
@@ -104,12 +105,19 @@ export function LogEntry({ summarized, viewMode, isExpanded, onToggleExpanded }:
|
||||
</div>
|
||||
<p className="text-sm mt-1">{displayMessage}</p>
|
||||
{viewMode === 'friendly' && summarized.isGroup && summarized.count > 1 && (
|
||||
<p className="text-xs text-muted-foreground mt-1">{summarized.count} similar events</p>
|
||||
<p className="text-xs text-muted-foreground mt-1">
|
||||
{summarized.count} similar events
|
||||
</p>
|
||||
)}
|
||||
</div>
|
||||
{(hasDetails || viewMode === 'friendly') && (
|
||||
<CollapsibleTrigger asChild>
|
||||
<Button variant="ghost" size="sm" className="shrink-0" aria-label="Toggle log details">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
className="shrink-0"
|
||||
aria-label="Toggle log details"
|
||||
>
|
||||
<ChevronDown
|
||||
className={cn('h-4 w-4 transition-transform', isExpanded && 'rotate-180')}
|
||||
/>
|
||||
@@ -120,12 +128,12 @@ export function LogEntry({ summarized, viewMode, isExpanded, onToggleExpanded }:
|
||||
|
||||
<CollapsibleContent>
|
||||
<LogEntryDetails
|
||||
log={log}
|
||||
summarized={summarized}
|
||||
viewMode={viewMode}
|
||||
sourceColors={sourceColors}
|
||||
/>
|
||||
</CollapsibleContent>
|
||||
log={safeLog}
|
||||
summarized={summarized}
|
||||
viewMode={viewMode}
|
||||
sourceColors={sourceColors}
|
||||
/>
|
||||
</CollapsibleContent>
|
||||
</div>
|
||||
</Collapsible>
|
||||
);
|
||||
@@ -149,9 +157,7 @@ function LogEntryDetails({ log, summarized, viewMode, sourceColors }: LogEntryDe
|
||||
<Badge variant="outline" className={cn('text-xs', sourceColors[log.source])}>
|
||||
{log.source}
|
||||
</Badge>
|
||||
<Badge variant="secondary">
|
||||
{log.origin}
|
||||
</Badge>
|
||||
<Badge variant="secondary">{log.origin}</Badge>
|
||||
<span className="font-mono">{format(new Date(log.timestamp), 'HH:mm:ss.SSS')}</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@@ -16,14 +16,11 @@ import {
|
||||
Layers,
|
||||
} from 'lucide-react';
|
||||
import { useState } from 'react';
|
||||
import type { VirtualItem } from '@tanstack/react-virtual';
|
||||
import { Badge } from '@/components/ui/badge';
|
||||
import { Button } from '@/components/ui/button';
|
||||
import { Card, CardContent, CardHeader} from '@/components/ui/card';
|
||||
import {
|
||||
Collapsible,
|
||||
CollapsibleContent,
|
||||
CollapsibleTrigger,
|
||||
} from '@/components/ui/collapsible';
|
||||
import { Card, CardContent, CardHeader } from '@/components/ui/card';
|
||||
import { Collapsible, CollapsibleContent, CollapsibleTrigger } from '@/components/ui/collapsible';
|
||||
import { formatGap, type LogGroup } from '@/lib/log-groups';
|
||||
import { isErrorGroup, isWarningGroup } from '@/lib/log-group-summarizer';
|
||||
import { cn } from '@/lib/utils';
|
||||
@@ -42,6 +39,12 @@ interface LogTimelineProps {
|
||||
readonly expandedLogs: ReadonlySet<string>;
|
||||
/** Callback when a log is toggled */
|
||||
readonly onToggleLog: (id: string) => void;
|
||||
/** Optional virtualization items */
|
||||
readonly virtualItems?: readonly VirtualItem[];
|
||||
/** Optional virtualization total size */
|
||||
readonly virtualTotalSize?: number;
|
||||
/** Optional measure element hook for virtualization */
|
||||
readonly measureElement?: (element: Element | null) => void;
|
||||
}
|
||||
|
||||
/** Props for a single timeline group */
|
||||
@@ -235,19 +238,73 @@ export function LogTimeline({
|
||||
maxLogsPerGroup = 10,
|
||||
expandedLogs,
|
||||
onToggleLog,
|
||||
virtualItems,
|
||||
virtualTotalSize,
|
||||
measureElement,
|
||||
}: LogTimelineProps) {
|
||||
if (groups.length === 0) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const isVirtualized =
|
||||
Boolean(virtualItems?.length) && typeof virtualTotalSize === 'number' && measureElement;
|
||||
|
||||
if (isVirtualized) {
|
||||
return (
|
||||
<div
|
||||
style={{
|
||||
height: virtualTotalSize,
|
||||
position: 'relative',
|
||||
}}
|
||||
>
|
||||
{virtualItems?.map((virtualRow) => {
|
||||
const group = groups[virtualRow.index];
|
||||
if (!group) {
|
||||
return null;
|
||||
}
|
||||
|
||||
const previousGroup =
|
||||
virtualRow.index > 0 ? groups[virtualRow.index - 1] : undefined;
|
||||
const gapFromPrevious = previousGroup
|
||||
? previousGroup.startTime - group.endTime
|
||||
: undefined;
|
||||
|
||||
return (
|
||||
<div
|
||||
key={group.id}
|
||||
ref={measureElement}
|
||||
className="pb-2"
|
||||
data-index={virtualRow.index}
|
||||
style={{
|
||||
position: 'absolute',
|
||||
top: 0,
|
||||
left: 0,
|
||||
width: '100%',
|
||||
transform: `translateY(${virtualRow.start}px)`,
|
||||
}}
|
||||
>
|
||||
<TimelineGroup
|
||||
group={group}
|
||||
viewMode={viewMode}
|
||||
maxLogs={maxLogsPerGroup}
|
||||
expandedLogs={expandedLogs}
|
||||
onToggleLog={onToggleLog}
|
||||
isFirst={virtualRow.index === 0}
|
||||
gapFromPrevious={gapFromPrevious}
|
||||
/>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<div className="space-y-2">
|
||||
{groups.map((group, index) => {
|
||||
// Calculate gap from previous group
|
||||
const previousGroup = index > 0 ? groups[index - 1] : undefined;
|
||||
const gapFromPrevious = previousGroup
|
||||
? previousGroup.startTime - group.endTime
|
||||
: undefined;
|
||||
const gapFromPrevious = previousGroup ? previousGroup.startTime - group.endTime : undefined;
|
||||
|
||||
return (
|
||||
<TimelineGroup
|
||||
|
||||
client/src/components/analytics/logs-tab-list.tsx (new file, 143 lines)
@@ -0,0 +1,143 @@
|
||||
import { FileText, RefreshCw } from 'lucide-react';
|
||||
import type { RefObject } from 'react';
|
||||
|
||||
import type { LogLevel, LogSource } from '@/api/types';
|
||||
import { LogEntry as LogEntryComponent, type LogEntryData } from '@/components/analytics/log-entry';
|
||||
import { LogTimeline } from '@/components/analytics/log-timeline';
|
||||
import { ScrollArea } from '@/components/ui/scroll-area';
|
||||
import type { LogGroup, GroupMode } from '@/lib/log-groups';
|
||||
import type { SummarizedLog } from '@/lib/log-summarizer';
|
||||
|
||||
type LogOrigin = 'client' | 'server';
|
||||
type ViewMode = 'friendly' | 'technical';
|
||||
type VirtualLogRow = { index: number; start: number };
|
||||
|
||||
interface LogsTabListProps {
|
||||
isLoading: boolean;
|
||||
filteredLogs: LogEntryData[];
|
||||
summarizedLogs: SummarizedLog<LogEntryData>[];
|
||||
logGroups: LogGroup[];
|
||||
groupMode: GroupMode;
|
||||
viewMode: ViewMode;
|
||||
expandedLogs: ReadonlySet<string>;
|
||||
onToggleExpanded: (id: string) => void;
|
||||
searchQuery: string;
|
||||
levelFilter: LogLevel | 'all';
|
||||
sourceFilter: LogSource | 'all';
|
||||
originFilter: LogOrigin | 'all';
|
||||
shouldVirtualizeLogs: boolean;
|
||||
shouldVirtualizeGroups: boolean;
|
||||
viewportRef: RefObject<HTMLDivElement>;
|
||||
virtualItems: VirtualLogRow[];
|
||||
virtualTotalSize: number;
|
||||
measureElement: (element: Element | null) => void;
|
||||
groupVirtualItems: VirtualLogRow[];
|
||||
groupVirtualTotalSize: number;
|
||||
groupMeasureElement: (element: Element | null) => void;
|
||||
}
|
||||
|
||||
export function LogsTabList({
|
||||
isLoading,
|
||||
filteredLogs,
|
||||
summarizedLogs,
|
||||
logGroups,
|
||||
groupMode,
|
||||
viewMode,
|
||||
expandedLogs,
|
||||
onToggleExpanded,
|
||||
searchQuery,
|
||||
levelFilter,
|
||||
sourceFilter,
|
||||
originFilter,
|
||||
shouldVirtualizeLogs,
|
||||
shouldVirtualizeGroups,
|
||||
viewportRef,
|
||||
virtualItems,
|
||||
virtualTotalSize,
|
||||
measureElement,
|
||||
groupVirtualItems,
|
||||
groupVirtualTotalSize,
|
||||
groupMeasureElement,
|
||||
}: LogsTabListProps) {
|
||||
return (
|
||||
<ScrollArea className="h-[500px] pr-4" viewportRef={viewportRef}>
|
||||
{isLoading ? (
|
||||
<div className="text-center py-12 text-muted-foreground">
|
||||
<RefreshCw className="h-12 w-12 mx-auto mb-3 opacity-50 animate-spin" />
|
||||
<p>Loading logs...</p>
|
||||
</div>
|
||||
) : filteredLogs.length === 0 ? (
|
||||
<div className="text-center py-12 text-muted-foreground">
|
||||
<FileText className="h-12 w-12 mx-auto mb-3 opacity-50" />
|
||||
<p>No logs found</p>
|
||||
<p className="text-sm">
|
||||
{searchQuery ||
|
||||
levelFilter !== 'all' ||
|
||||
sourceFilter !== 'all' ||
|
||||
originFilter !== 'all'
|
||||
? 'Try adjusting your filters'
|
||||
: 'Logs will appear here as events occur'}
|
||||
</p>
|
||||
</div>
|
||||
) : groupMode !== 'none' ? (
|
||||
<LogTimeline
|
||||
groups={logGroups}
|
||||
viewMode={viewMode}
|
||||
expandedLogs={expandedLogs}
|
||||
onToggleLog={onToggleExpanded}
|
||||
virtualItems={shouldVirtualizeGroups ? groupVirtualItems : undefined}
|
||||
virtualTotalSize={shouldVirtualizeGroups ? groupVirtualTotalSize : undefined}
|
||||
measureElement={shouldVirtualizeGroups ? groupMeasureElement : undefined}
|
||||
/>
|
||||
) : shouldVirtualizeLogs ? (
|
||||
<div
|
||||
style={{
|
||||
height: virtualTotalSize,
|
||||
position: 'relative',
|
||||
}}
|
||||
>
|
||||
{virtualItems.map((virtualRow) => {
|
||||
const summarized = summarizedLogs[virtualRow.index];
|
||||
if (!summarized) {
|
||||
return null;
|
||||
}
|
||||
return (
|
||||
<div
|
||||
key={summarized.log.id}
|
||||
ref={measureElement}
|
||||
className="pb-2"
|
||||
data-index={virtualRow.index}
|
||||
style={{
|
||||
position: 'absolute',
|
||||
top: 0,
|
||||
left: 0,
|
||||
width: '100%',
|
||||
transform: `translateY(${virtualRow.start}px)`,
|
||||
}}
|
||||
>
|
||||
<LogEntryComponent
|
||||
summarized={summarized as SummarizedLog<LogEntryData>}
|
||||
viewMode={viewMode}
|
||||
isExpanded={expandedLogs.has(summarized.log.id)}
|
||||
onToggleExpanded={() => onToggleExpanded(summarized.log.id)}
|
||||
/>
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
) : (
|
||||
<div className="space-y-2">
|
||||
{summarizedLogs.map((summarized) => (
|
||||
<LogEntryComponent
|
||||
key={summarized.log.id}
|
||||
summarized={summarized as SummarizedLog<LogEntryData>}
|
||||
viewMode={viewMode}
|
||||
isExpanded={expandedLogs.has(summarized.log.id)}
|
||||
onToggleExpanded={() => onToggleExpanded(summarized.log.id)}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</ScrollArea>
|
||||
);
|
||||
}
|
||||
@@ -7,6 +7,8 @@ import type { GetRecentLogsResponse, LogEntry } from '@/api/types';
|
||||
import { addClientLog, clearClientLogs } from '@/lib/client-logs';
|
||||
import { LogsTab } from './logs-tab';
|
||||
|
||||
const SEARCH_DEBOUNCE_MS = 200;
|
||||
|
||||
// Mock the API module
|
||||
vi.mock('@/api/interface', () => ({
|
||||
getAPI: vi.fn(),
|
||||
@@ -23,16 +25,20 @@ const clientLogState = vi.hoisted(() => ({
|
||||
metadata?: Record<string, string>;
|
||||
origin: 'client';
|
||||
}>,
|
||||
listeners: new Set<(logs: Array<{
|
||||
id: string;
|
||||
timestamp: number;
|
||||
level: string;
|
||||
source: string;
|
||||
message: string;
|
||||
details?: string;
|
||||
metadata?: Record<string, string>;
|
||||
origin: 'client';
|
||||
}>) => void>(),
|
||||
listeners: new Set<
|
||||
(
|
||||
logs: Array<{
|
||||
id: string;
|
||||
timestamp: number;
|
||||
level: string;
|
||||
source: string;
|
||||
message: string;
|
||||
details?: string;
|
||||
metadata?: Record<string, string>;
|
||||
origin: 'client';
|
||||
}>
|
||||
) => void
|
||||
>(),
|
||||
}));
|
||||
|
||||
vi.mock('@/lib/client-logs', () => ({
|
||||
@@ -328,8 +334,14 @@ describe('LogsTab', () => {
|
||||
|
||||
// Search for "Connection"
|
||||
const searchInput = screen.getByPlaceholderText('Search logs...');
|
||||
vi.useFakeTimers();
|
||||
fireEvent.change(searchInput, { target: { value: 'Connection' } });
|
||||
|
||||
act(() => {
|
||||
vi.advanceTimersByTime(SEARCH_DEBOUNCE_MS);
|
||||
});
|
||||
vi.useRealTimers();
|
||||
|
||||
expect(screen.getAllByText('Connection established').length).toBeGreaterThan(0);
|
||||
expect(screen.getAllByText('Connection closed').length).toBeGreaterThan(0);
|
||||
expect(screen.queryAllByText('User logged in')).toHaveLength(0);
|
||||
@@ -352,8 +364,14 @@ describe('LogsTab', () => {
|
||||
});
|
||||
|
||||
const searchInput = screen.getByPlaceholderText('Search logs...');
|
||||
vi.useFakeTimers();
|
||||
fireEvent.change(searchInput, { target: { value: 'req-99' } });
|
||||
|
||||
act(() => {
|
||||
vi.advanceTimersByTime(SEARCH_DEBOUNCE_MS);
|
||||
});
|
||||
vi.useRealTimers();
|
||||
|
||||
expect(screen.getAllByText('Metadata log').length).toBeGreaterThan(0);
|
||||
expect(screen.queryAllByText('Other log')).toHaveLength(0);
|
||||
});
|
||||
@@ -454,9 +472,7 @@ describe('LogsTab', () => {
|
||||
it('exports logs and revokes the object URL', async () => {
|
||||
const createObjectURL = vi.fn(() => 'blob:logs');
|
||||
const revokeObjectURL = vi.fn();
|
||||
const clickMock = vi
|
||||
.spyOn(HTMLAnchorElement.prototype, 'click')
|
||||
.mockImplementation(() => {});
|
||||
const clickMock = vi.spyOn(HTMLAnchorElement.prototype, 'click').mockImplementation(() => {});
|
||||
vi.stubGlobal('URL', { createObjectURL, revokeObjectURL });
|
||||
|
||||
mockAPI.getRecentLogs.mockResolvedValue({
|
||||
|
||||
@@ -1,4 +1,5 @@
|
||||
import { useQuery } from '@tanstack/react-query';
|
||||
import { useVirtualizer } from '@tanstack/react-virtual';
|
||||
import { format } from 'date-fns';
|
||||
import {
|
||||
Clock,
|
||||
@@ -13,18 +14,17 @@ import {
|
||||
Search,
|
||||
Terminal,
|
||||
} from 'lucide-react';
|
||||
import { useEffect, useMemo, useState } from 'react';
|
||||
import { useEffect, useMemo, useRef, useState } from 'react';
|
||||
import { Timing } from '@/api/constants';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { LogLevel as ApiLogLevel, LogSource as ApiLogSource } from '@/api/types';
|
||||
import { LogEntry as LogEntryComponent, type LogEntryData } from '@/components/analytics/log-entry';
|
||||
import type { LogEntryData } from '@/components/analytics/log-entry';
|
||||
import { levelConfig } from '@/components/analytics/log-entry-config';
|
||||
import { AnalyticsCardTitle } from '@/components/analytics/analytics-card-title';
|
||||
import { LogTimeline } from '@/components/analytics/log-timeline';
|
||||
import { LogsTabList } from '@/components/analytics/logs-tab-list';
|
||||
import { Button } from '@/components/ui/button';
|
||||
import { Card, CardContent, CardDescription, CardHeader } from '@/components/ui/card';
|
||||
import { Input } from '@/components/ui/input';
|
||||
import { ScrollArea } from '@/components/ui/scroll-area';
|
||||
import {
|
||||
Select,
|
||||
SelectContent,
|
||||
@@ -33,19 +33,12 @@ import {
|
||||
SelectValue,
|
||||
} from '@/components/ui/select';
|
||||
import { ToggleGroup, ToggleGroupItem } from '@/components/ui/toggle-group';
|
||||
import {
|
||||
Tooltip,
|
||||
TooltipContent,
|
||||
TooltipProvider,
|
||||
TooltipTrigger,
|
||||
} from '@/components/ui/tooltip';
|
||||
import {
|
||||
getClientLogs,
|
||||
subscribeClientLogs,
|
||||
type ClientLogEntry,
|
||||
} from '@/lib/client-logs';
|
||||
import { Tooltip, TooltipContent, TooltipProvider, TooltipTrigger } from '@/components/ui/tooltip';
|
||||
import { getClientLogs, subscribeClientLogs, type ClientLogEntry } from '@/lib/client-logs';
|
||||
import { buildExportBlob, downloadBlob } from '@/lib/download-utils';
|
||||
import { convertLogEntry } from '@/lib/log-converters';
|
||||
import { groupLogs, type GroupMode } from '@/lib/log-groups';
|
||||
import { sanitizeLogEntry } from '@/lib/log-sanitizer';
|
||||
import {
|
||||
summarizeConsecutive,
|
||||
type SummarizableLog,
|
||||
@@ -60,9 +53,17 @@ type LogOrigin = 'client' | 'server';
|
||||
type ViewMode = 'friendly' | 'technical';
|
||||
|
||||
const LOG_LEVELS: LogLevel[] = ['info', 'warning', 'error', 'debug'];
|
||||
const LOG_VIRTUALIZE_THRESHOLD = 200;
|
||||
const LOG_ESTIMATED_ROW_HEIGHT = 120;
|
||||
const LOG_OVERSCAN = 8;
|
||||
const LOG_SEARCH_DEBOUNCE_MS = 200;
|
||||
const LOG_GROUP_VIRTUALIZE_THRESHOLD = 40;
|
||||
const LOG_GROUP_ESTIMATED_ROW_HEIGHT = 220;
|
||||
const LOG_GROUP_OVERSCAN = 6;
|
||||
|
||||
export function LogsTab() {
|
||||
const [searchQuery, setSearchQuery] = useState('');
|
||||
const [debouncedQuery, setDebouncedQuery] = useState('');
|
||||
const [levelFilter, setLevelFilter] = useState<LogLevel | 'all'>('all');
|
||||
const [sourceFilter, setSourceFilter] = useState<LogSource | 'all'>('all');
|
||||
const [originFilter, setOriginFilter] = useState<LogOrigin | 'all'>('all');
|
||||
@@ -71,6 +72,17 @@ export function LogsTab() {
|
||||
const [viewMode, setViewMode] = useState<ViewMode>('friendly');
|
||||
const [enableSummarization, setEnableSummarization] = useState(true);
|
||||
const [groupMode, setGroupMode] = useState<GroupMode>('none');
|
||||
const logViewportRef = useRef<HTMLDivElement>(null);
|
||||
|
||||
useEffect(() => {
|
||||
if (searchQuery === debouncedQuery) {
|
||||
return;
|
||||
}
|
||||
const handle = setTimeout(() => {
|
||||
setDebouncedQuery(searchQuery);
|
||||
}, LOG_SEARCH_DEBOUNCE_MS);
|
||||
return () => clearTimeout(handle);
|
||||
}, [searchQuery, debouncedQuery]);
|
||||
|
||||
useEffect(() => subscribeClientLogs(setClientLogs), []);
|
||||
|
||||
@@ -111,7 +123,7 @@ export function LogsTab() {
|
||||
|
||||
// Client-side search filtering (level/source already filtered by API)
|
||||
const filteredLogs = useMemo(() => {
|
||||
const query = searchQuery.toLowerCase();
|
||||
const query = debouncedQuery.toLowerCase();
|
||||
return mergedLogs.filter((log) => {
|
||||
if (originFilter !== 'all' && log.origin !== originFilter) {
|
||||
return false;
|
||||
@@ -134,7 +146,7 @@ export function LogsTab() {
|
||||
correlationText.includes(query)
|
||||
);
|
||||
});
|
||||
}, [mergedLogs, searchQuery, originFilter, levelFilter, sourceFilter]);
|
||||
}, [mergedLogs, debouncedQuery, originFilter, levelFilter, sourceFilter]);
|
||||
|
||||
// Apply summarization when enabled
|
||||
const summarizedLogs = useMemo(() => {
|
||||
@@ -149,6 +161,15 @@ export function LogsTab() {
|
||||
return summarizeConsecutive(filteredLogs as SummarizableLog[]) as SummarizedLog<LogEntryData>[];
|
||||
}, [filteredLogs, enableSummarization]);
|
||||
|
||||
const logVirtualizer = useVirtualizer({
|
||||
count: summarizedLogs.length,
|
||||
getScrollElement: () => logViewportRef.current,
|
||||
estimateSize: () => LOG_ESTIMATED_ROW_HEIGHT,
|
||||
overscan: LOG_OVERSCAN,
|
||||
});
|
||||
|
||||
const shouldVirtualizeLogs = summarizedLogs.length > LOG_VIRTUALIZE_THRESHOLD;
|
||||
|
||||
// Group logs when in timeline mode
|
||||
const logGroups = useMemo(() => {
|
||||
if (groupMode === 'none') {
|
||||
@@ -157,6 +178,16 @@ export function LogsTab() {
|
||||
return groupLogs(filteredLogs, groupMode);
|
||||
}, [filteredLogs, groupMode]);
|
||||
|
||||
const groupVirtualizer = useVirtualizer({
|
||||
count: logGroups.length,
|
||||
getScrollElement: () => logViewportRef.current,
|
||||
estimateSize: () => LOG_GROUP_ESTIMATED_ROW_HEIGHT,
|
||||
overscan: LOG_GROUP_OVERSCAN,
|
||||
});
|
||||
|
||||
const shouldVirtualizeGroups =
|
||||
groupMode !== 'none' && logGroups.length > LOG_GROUP_VIRTUALIZE_THRESHOLD;
|
||||
|
||||
const logStats = useMemo<Record<LogLevel, number>>(() => {
|
||||
return filteredLogs.reduce(
|
||||
(stats, log) => {
|
||||
@@ -184,13 +215,9 @@ export function LogsTab() {
|
||||
};
|
||||
|
||||
const exportLogs = () => {
|
||||
const blob = new Blob([JSON.stringify(filteredLogs, null, 2)], { type: 'application/json' });
|
||||
const url = URL.createObjectURL(blob);
|
||||
const a = document.createElement('a');
|
||||
a.href = url;
|
||||
a.download = `logs-${format(new Date(), 'yyyy-MM-dd-HHmmss')}.json`;
|
||||
a.click();
|
||||
URL.revokeObjectURL(url);
|
||||
const sanitizedLogs = filteredLogs.map(sanitizeLogEntry);
|
||||
const blob = buildExportBlob('json', JSON.stringify(sanitizedLogs, null, 2));
|
||||
downloadBlob(blob, `logs-${format(new Date(), 'yyyy-MM-dd-HHmmss')}.json`);
|
||||
};
|
||||
|
||||
return (
|
||||
@@ -405,58 +432,41 @@ export function LogsTab() {
|
||||
|
||||
{groupMode !== 'none' ? (
|
||||
<span className="text-xs text-muted-foreground">
|
||||
{logGroups.length} group{logGroups.length !== 1 ? 's' : ''},{' '}
|
||||
{filteredLogs.length} total logs
|
||||
{logGroups.length} group{logGroups.length !== 1 ? 's' : ''}, {filteredLogs.length}{' '}
|
||||
total logs
|
||||
</span>
|
||||
) : enableSummarization && summarizedLogs.some((s) => s.isGroup) ? (
|
||||
<span className="text-xs text-muted-foreground">
|
||||
{summarizedLogs.filter((s) => s.isGroup).length} groups,{' '}
|
||||
{filteredLogs.length} total logs
|
||||
{summarizedLogs.filter((s) => s.isGroup).length} groups, {filteredLogs.length} total
|
||||
logs
|
||||
</span>
|
||||
) : null}
|
||||
</div>
|
||||
|
||||
{/* Log List */}
|
||||
<ScrollArea className="h-[500px] pr-4">
|
||||
{isLoading ? (
|
||||
<div className="text-center py-12 text-muted-foreground">
|
||||
<RefreshCw className="h-12 w-12 mx-auto mb-3 opacity-50 animate-spin" />
|
||||
<p>Loading logs...</p>
|
||||
</div>
|
||||
) : filteredLogs.length === 0 ? (
|
||||
<div className="text-center py-12 text-muted-foreground">
|
||||
<FileText className="h-12 w-12 mx-auto mb-3 opacity-50" />
|
||||
<p>No logs found</p>
|
||||
<p className="text-sm">
|
||||
{searchQuery ||
|
||||
levelFilter !== 'all' ||
|
||||
sourceFilter !== 'all' ||
|
||||
originFilter !== 'all'
|
||||
? 'Try adjusting your filters'
|
||||
: 'Logs will appear here as events occur'}
|
||||
</p>
|
||||
</div>
|
||||
) : groupMode !== 'none' ? (
|
||||
<LogTimeline
|
||||
groups={logGroups}
|
||||
viewMode={viewMode}
|
||||
expandedLogs={expandedLogs}
|
||||
onToggleLog={toggleExpanded}
|
||||
/>
|
||||
) : (
|
||||
<div className="space-y-2">
|
||||
{summarizedLogs.map((summarized) => (
|
||||
<LogEntryComponent
|
||||
key={summarized.log.id}
|
||||
summarized={summarized as SummarizedLog<LogEntryData>}
|
||||
viewMode={viewMode}
|
||||
isExpanded={expandedLogs.has(summarized.log.id)}
|
||||
onToggleExpanded={() => toggleExpanded(summarized.log.id)}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</ScrollArea>
|
||||
<LogsTabList
|
||||
isLoading={isLoading}
|
||||
filteredLogs={filteredLogs}
|
||||
summarizedLogs={summarizedLogs as SummarizedLog<LogEntryData>[]}
|
||||
logGroups={logGroups}
|
||||
groupMode={groupMode}
|
||||
viewMode={viewMode}
|
||||
expandedLogs={expandedLogs}
|
||||
onToggleExpanded={toggleExpanded}
|
||||
searchQuery={searchQuery}
|
||||
levelFilter={levelFilter}
|
||||
sourceFilter={sourceFilter}
|
||||
originFilter={originFilter}
|
||||
shouldVirtualizeLogs={shouldVirtualizeLogs}
|
||||
shouldVirtualizeGroups={shouldVirtualizeGroups}
|
||||
viewportRef={logViewportRef}
|
||||
virtualItems={logVirtualizer.getVirtualItems()}
|
||||
virtualTotalSize={logVirtualizer.getTotalSize()}
|
||||
measureElement={logVirtualizer.measureElement}
|
||||
groupVirtualItems={groupVirtualizer.getVirtualItems()}
|
||||
groupVirtualTotalSize={groupVirtualizer.getTotalSize()}
|
||||
groupMeasureElement={groupVirtualizer.measureElement}
|
||||
/>
|
||||
|
||||
{/* Footer */}
|
||||
<div className="mt-4 pt-4 border-t flex items-center justify-between text-sm text-muted-foreground">
|
||||
|
||||
@@ -50,8 +50,7 @@ export function ConfirmationDialog({
|
||||
onOpenChange(false);
|
||||
};
|
||||
|
||||
const descriptionContent =
|
||||
typeof description === 'string' ? <p>{description}</p> : description;
|
||||
const descriptionContent = typeof description === 'string' ? <p>{description}</p> : description;
|
||||
|
||||
return (
|
||||
<AlertDialog open={open} onOpenChange={onOpenChange}>
|
||||
@@ -61,9 +60,7 @@ export function ConfirmationDialog({
|
||||
{icon ?? <AlertTriangle className="h-5 w-5 text-amber-500" />}
|
||||
{title}
|
||||
</AlertDialogTitle>
|
||||
<AlertDialogDescription asChild>
|
||||
{descriptionContent}
|
||||
</AlertDialogDescription>
|
||||
<AlertDialogDescription asChild>{descriptionContent}</AlertDialogDescription>
|
||||
</AlertDialogHeader>
|
||||
{extraContent}
|
||||
<AlertDialogFooter>
|
||||
|
||||
@@ -2,7 +2,7 @@
|
||||
|
||||
import { CloudOff, Loader2, RefreshCw, Wifi, WifiOff } from 'lucide-react';
|
||||
import { useEffect, useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { ServerInfo } from '@/api/types';
|
||||
import { useConnectionState } from '@/contexts/connection-state';
|
||||
import { type ConnectionStatusType, ConnectionStatus as ConnStatus, iconSize } from '@/lib/styles';
|
||||
|
||||
@@ -68,13 +68,7 @@ function subscribeToProfileSamples(listener: ProfileListener): () => void {
|
||||
return () => profileListeners.delete(listener);
|
||||
}
|
||||
|
||||
export function DevProfiler({
|
||||
id,
|
||||
children,
|
||||
}: {
|
||||
id: string;
|
||||
children: ReactNode;
|
||||
}): JSX.Element {
|
||||
export function DevProfiler({ id, children }: { id: string; children: ReactNode }): JSX.Element {
|
||||
const [enabled, setEnabled] = useState(readProfilerEnabled);
|
||||
|
||||
useEffect(() => {
|
||||
@@ -102,17 +96,16 @@ export function DevProfiler({
|
||||
}, []);
|
||||
|
||||
const onRender = useMemo<ProfilerOnRenderCallback>(
|
||||
() =>
|
||||
(profileId, phase, actualDuration, baseDuration, startTime, commitTime) => {
|
||||
emitProfileSample({
|
||||
id: profileId,
|
||||
phase,
|
||||
actualDuration,
|
||||
baseDuration,
|
||||
startTime,
|
||||
commitTime,
|
||||
});
|
||||
},
|
||||
() => (profileId, phase, actualDuration, baseDuration, startTime, commitTime) => {
|
||||
emitProfileSample({
|
||||
id: profileId,
|
||||
phase,
|
||||
actualDuration,
|
||||
baseDuration,
|
||||
startTime,
|
||||
commitTime,
|
||||
});
|
||||
},
|
||||
[]
|
||||
);
|
||||
|
||||
|
||||
@@ -63,9 +63,7 @@ export function CalendarConfig({
|
||||
<SecretInput
|
||||
label="Client Secret"
|
||||
value={oauthConfig.client_secret}
|
||||
onChange={(value) =>
|
||||
onUpdate({ oauth_config: { ...oauthConfig, client_secret: value } })
|
||||
}
|
||||
onChange={(value) => onUpdate({ oauth_config: { ...oauthConfig, client_secret: value } })}
|
||||
placeholder="Enter client secret"
|
||||
showSecret={showSecrets.calendar_client_secret ?? false}
|
||||
onToggleSecret={() => toggleSecret('calendar_client_secret')}
|
||||
|
||||
@@ -26,9 +26,7 @@ function createStepState(overrides: Partial<StepState> = {}): StepState {
|
||||
}
|
||||
|
||||
/** Factory to create a complete processing state */
|
||||
function createProcessingState(
|
||||
overrides: Partial<PostProcessingState> = {}
|
||||
): PostProcessingState {
|
||||
function createProcessingState(overrides: Partial<PostProcessingState> = {}): PostProcessingState {
|
||||
return {
|
||||
meetingId: null,
|
||||
summary: createStepState(),
|
||||
@@ -276,9 +274,7 @@ describe('ProcessingStatus', () => {
|
||||
describe('className prop', () => {
|
||||
it('applies custom className to full mode container', () => {
|
||||
const state = createProcessingState({ meetingId: 'meeting-123' });
|
||||
const { container } = render(
|
||||
<ProcessingStatus state={state} className="custom-class" />
|
||||
);
|
||||
const { container } = render(<ProcessingStatus state={state} className="custom-class" />);
|
||||
|
||||
expect(container.firstChild).toHaveClass('custom-class');
|
||||
});
|
||||
|
||||
@@ -2,7 +2,7 @@
|
||||
|
||||
import { Trash2, UserPlus } from 'lucide-react';
|
||||
import { useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { ProjectMembership, ProjectRole } from '@/api/types';
|
||||
import { useProjectMembers } from '@/hooks/use-project-members';
|
||||
import { useGuardedMutation } from '@/hooks/use-guarded-mutation';
|
||||
|
||||
@@ -203,7 +203,7 @@ export function ProjectSettingsPanel({ project }: { project: Project }) {
|
||||
|
||||
const selectedTemplate =
|
||||
defaultTemplateId !== 'inherit'
|
||||
? templates.find((template) => template.id === defaultTemplateId) ?? null
|
||||
? (templates.find((template) => template.id === defaultTemplateId) ?? null)
|
||||
: null;
|
||||
const isMissingTemplate = defaultTemplateId !== 'inherit' && !selectedTemplate;
|
||||
|
||||
|
||||
@@ -69,6 +69,12 @@ export function AudioDeviceSelector({
|
||||
displayLabel = inferLabelFromDeviceId(effectiveInputDevice);
|
||||
}
|
||||
|
||||
const compactAriaLabel = compact
|
||||
? hasPermission === false
|
||||
? 'Grant audio access'
|
||||
: `Select microphone: ${displayLabel}`
|
||||
: undefined;
|
||||
|
||||
// Truncate long labels
|
||||
const truncatedLabel =
|
||||
displayLabel.length > 25 ? `${displayLabel.substring(0, 22)}...` : displayLabel;
|
||||
@@ -79,6 +85,7 @@ export function AudioDeviceSelector({
|
||||
variant="outline"
|
||||
size="sm"
|
||||
onClick={loadDevices}
|
||||
aria-label={compactAriaLabel}
|
||||
className={cn('gap-2', className)}
|
||||
disabled={disabled}
|
||||
>
|
||||
@@ -95,6 +102,7 @@ export function AudioDeviceSelector({
|
||||
variant="outline"
|
||||
size="sm"
|
||||
disabled={disabled || isLoading}
|
||||
aria-label={compactAriaLabel}
|
||||
className={cn('gap-2', className)}
|
||||
>
|
||||
{isLoading ? (
|
||||
|
||||
@@ -3,6 +3,10 @@
|
||||
import { useEffect, useState } from 'react';
|
||||
import { cn } from '@/lib/utils';
|
||||
|
||||
const LEVEL_PERCENT_MAX = 100;
|
||||
const MIN_BAR_HEIGHT_PERCENT = 8;
|
||||
const RANDOM_LEVEL_INTERVAL_MS = 100;
|
||||
|
||||
interface AudioLevelMeterProps {
|
||||
isActive: boolean;
|
||||
barCount?: number;
|
||||
@@ -14,7 +18,7 @@ function buildLevels(count: number, value: number): number[] {
|
||||
}
|
||||
|
||||
function buildRandomLevels(count: number): number[] {
|
||||
return Array.from({ length: count }, () => Math.random() * 100);
|
||||
return Array.from({ length: count }, () => Math.random() * LEVEL_PERCENT_MAX);
|
||||
}
|
||||
|
||||
function normalizeLevel(level: number): number {
|
||||
@@ -22,7 +26,7 @@ function normalizeLevel(level: number): number {
|
||||
return 0;
|
||||
}
|
||||
if (level > 1) {
|
||||
return Math.min(level, 100) / 100;
|
||||
return Math.min(level, LEVEL_PERCENT_MAX) / LEVEL_PERCENT_MAX;
|
||||
}
|
||||
return Math.max(level, 0);
|
||||
}
|
||||
@@ -44,7 +48,7 @@ export function AudioLevelMeter({ isActive, barCount = 12, level }: AudioLevelMe
|
||||
|
||||
const interval = setInterval(() => {
|
||||
setLevels(buildRandomLevels(barCount));
|
||||
}, 100);
|
||||
}, RANDOM_LEVEL_INTERVAL_MS);
|
||||
|
||||
return () => clearInterval(interval);
|
||||
}, [isActive, barCount, level]);
|
||||
@@ -58,7 +62,12 @@ export function AudioLevelMeter({ isActive, barCount = 12, level }: AudioLevelMe
|
||||
'w-full rounded-sm transition-all duration-100',
|
||||
isActive ? 'bg-success' : 'bg-muted'
|
||||
)}
|
||||
style={{ height: `${Math.max(8, isActive ? level : 8)}%` }}
|
||||
style={{
|
||||
height: `${Math.max(
|
||||
MIN_BAR_HEIGHT_PERCENT,
|
||||
isActive ? level : MIN_BAR_HEIGHT_PERCENT
|
||||
)}%`,
|
||||
}}
|
||||
/>
|
||||
))}
|
||||
</div>
|
||||
|
||||
@@ -300,8 +300,7 @@ export function getSuggestedModelSize(
|
||||
return sorted[Math.floor(sorted.length / 2)] ?? null;
|
||||
}
|
||||
|
||||
const targetCompute =
|
||||
computeType ?? (device === 'cuda' ? 'float16' : 'int8');
|
||||
const targetCompute = computeType ?? (device === 'cuda' ? 'float16' : 'int8');
|
||||
const descending = [...sorted].reverse();
|
||||
for (const size of descending) {
|
||||
const estimate = estimateResourceUsage(size, targetCompute);
|
||||
@@ -342,8 +341,8 @@ export function getSuggestedComputeType(
|
||||
return 'int8';
|
||||
}
|
||||
} else if (device === 'cpu' && types.includes('int8')) {
|
||||
return 'int8';
|
||||
}
|
||||
return 'int8';
|
||||
}
|
||||
return types[0] ?? null;
|
||||
}
|
||||
|
||||
@@ -360,8 +359,8 @@ export function getSuggestedComputeType(
|
||||
return 'int8';
|
||||
}
|
||||
} else if (device === 'cpu' && types.includes('int8')) {
|
||||
return 'int8';
|
||||
}
|
||||
return 'int8';
|
||||
}
|
||||
return types[0] ?? null;
|
||||
}
|
||||
|
||||
@@ -426,7 +425,10 @@ export function resolveModelSize(pending: string | null, config: ASRConfiguratio
|
||||
}
|
||||
|
||||
/** Resolve effective device from pending state or config. */
|
||||
export function resolveDevice(pending: ASRDevice | null, config: ASRConfiguration | null): ASRDevice {
|
||||
export function resolveDevice(
|
||||
pending: ASRDevice | null,
|
||||
config: ASRConfiguration | null
|
||||
): ASRDevice {
|
||||
if (pending !== null) {
|
||||
return pending;
|
||||
}
|
||||
|
||||
@@ -22,7 +22,10 @@ import { useHuggingFaceToken } from '@/hooks/use-huggingface-token';
|
||||
import { iconSize } from '@/lib/styles';
|
||||
import { cn } from '@/lib/utils';
|
||||
import { ModelAuthSection, getModelAuthSummary } from './model-auth-section';
|
||||
import { TranscriptionEngineSection, getTranscriptionEngineSummary } from './transcription-engine-section';
|
||||
import {
|
||||
TranscriptionEngineSection,
|
||||
getTranscriptionEngineSummary,
|
||||
} from './transcription-engine-section';
|
||||
import { StreamingConfigSection, getStreamingConfigSummary } from './streaming-config-section';
|
||||
|
||||
interface AdvancedLocalAISettingsProps {
|
||||
@@ -62,8 +65,7 @@ export function AdvancedLocalAISettings({ serverInfo }: AdvancedLocalAISettingsP

const isLoading = isAsrLoading || isHfLoading || isStreamingLoading;
const diarizationEnabled = serverInfo?.diarization_enabled ?? false;
const needsTokenWarning =
diarizationEnabled && hfStatus !== null && !hfStatus.isConfigured;
const needsTokenWarning = diarizationEnabled && hfStatus !== null && !hfStatus.isConfigured;
const needsValidationWarning =
diarizationEnabled && hfStatus !== null && hfStatus.isConfigured && !hfStatus.isValidated;

@@ -6,18 +6,12 @@
*/

import { useState } from 'react';
import {
CheckCircle2,
Download,
Eye,
EyeOff,
Key,
Loader2,
} from 'lucide-react';
import { CheckCircle2, Download, Eye, EyeOff, Key, Loader2 } from 'lucide-react';
import type { HuggingFaceTokenStatus } from '@/api/types';
import { Button } from '@/components/ui/button';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
import { toast } from '@/hooks/use-toast';
import { EXTERNAL_LINKS } from '@/lib/config/provider-endpoints';
import { EXTERNAL_LINK_REL, iconSize } from '@/lib/styles';
import { cn } from '@/lib/utils';
@@ -52,6 +46,7 @@ export function ModelAuthSection({
const success = await setToken(tokenInput.trim(), true);
if (success) {
setTokenInput('');
toast({ title: 'Token saved' });
}
};

@@ -140,11 +135,7 @@ export function ModelAuthSection({
disabled={!tokenInput.trim() || isSaving}
size="default"
>
{isSaving ? (
<Loader2 className={cn(iconSize.sm, 'animate-spin')} />
) : (
'Save'
)}
{isSaving ? <Loader2 className={cn(iconSize.sm, 'animate-spin')} /> : 'Save'}
</Button>
</div>
<p className="text-xs text-muted-foreground">
@@ -164,12 +155,7 @@ export function ModelAuthSection({
{/* Action buttons for existing token */}
{status?.isConfigured && (
<div className="flex gap-2">
<Button
variant="outline"
size="sm"
onClick={handleValidate}
disabled={isValidating}
>
<Button variant="outline" size="sm" onClick={handleValidate} disabled={isValidating}>
{isValidating ? (
<>
<Loader2 className={cn(iconSize.sm, 'mr-2 animate-spin')} />

@@ -93,10 +93,8 @@ export function ResourceFitPanel({
<div className="space-y-1 text-xs text-muted-foreground">
<p>
Estimate:{' '}
<span className={typography.fontMedium}>
{getModelSizeLabel(modelSizeValue)}
</span>{' '}
• <span>{COMPUTE_TYPE_LABELS[computeTypeForEstimate]}</span> • Speed:{' '}
<span className={typography.fontMedium}>{getModelSizeLabel(modelSizeValue)}</span> •{' '}
<span>{COMPUTE_TYPE_LABELS[computeTypeForEstimate]}</span> • Speed:{' '}
{resourceEstimate.speed} • Accuracy: {resourceEstimate.accuracy}
</p>
{effectiveComputeType === 'unspecified' && (

@@ -5,10 +5,7 @@
*/

import { useState } from 'react';
import type {
StreamingConfiguration,
UpdateStreamingConfigurationRequest,
} from '@/api/types';
import type { StreamingConfiguration, UpdateStreamingConfigurationRequest } from '@/api/types';
import { Button } from '@/components/ui/button';
import { Input } from '@/components/ui/input';
import { Label } from '@/components/ui/label';
@@ -100,46 +97,29 @@ export function StreamingConfigSection({
const [pendingLeadingBuffer, setPendingLeadingBuffer] = useState<number | null>(null);

if (!config) {
return (
<p className="text-sm text-muted-foreground">
Streaming configuration not available.
</p>
);
return <p className="text-sm text-muted-foreground">Streaming configuration not available.</p>;
}

const effectivePartialCadence =
pendingPartialCadence ?? config.partialCadenceSeconds;
const effectiveMinPartialAudio =
pendingMinPartialAudio ?? config.minPartialAudioSeconds;
const effectiveMaxSegmentDuration =
pendingMaxSegmentDuration ?? config.maxSegmentDurationSeconds;
const effectiveMinSpeechDuration =
pendingMinSpeechDuration ?? config.minSpeechDurationSeconds;
const effectiveTrailingSilence =
pendingTrailingSilence ?? config.trailingSilenceSeconds;
const effectiveLeadingBuffer =
pendingLeadingBuffer ?? config.leadingBufferSeconds;
const effectivePartialCadence = pendingPartialCadence ?? config.partialCadenceSeconds;
const effectiveMinPartialAudio = pendingMinPartialAudio ?? config.minPartialAudioSeconds;
const effectiveMaxSegmentDuration = pendingMaxSegmentDuration ?? config.maxSegmentDurationSeconds;
const effectiveMinSpeechDuration = pendingMinSpeechDuration ?? config.minSpeechDurationSeconds;
const effectiveTrailingSilence = pendingTrailingSilence ?? config.trailingSilenceSeconds;
const effectiveLeadingBuffer = pendingLeadingBuffer ?? config.leadingBufferSeconds;

const hasChanges =
(pendingPartialCadence !== null &&
pendingPartialCadence !== config.partialCadenceSeconds) ||
(pendingMinPartialAudio !== null &&
pendingMinPartialAudio !== config.minPartialAudioSeconds) ||
(pendingPartialCadence !== null && pendingPartialCadence !== config.partialCadenceSeconds) ||
(pendingMinPartialAudio !== null && pendingMinPartialAudio !== config.minPartialAudioSeconds) ||
(pendingMaxSegmentDuration !== null &&
pendingMaxSegmentDuration !== config.maxSegmentDurationSeconds) ||
(pendingMinSpeechDuration !== null &&
pendingMinSpeechDuration !== config.minSpeechDurationSeconds) ||
(pendingTrailingSilence !== null &&
pendingTrailingSilence !== config.trailingSilenceSeconds) ||
(pendingLeadingBuffer !== null &&
pendingLeadingBuffer !== config.leadingBufferSeconds);
(pendingTrailingSilence !== null && pendingTrailingSilence !== config.trailingSilenceSeconds) ||
(pendingLeadingBuffer !== null && pendingLeadingBuffer !== config.leadingBufferSeconds);

const handleApply = async () => {
const request: UpdateStreamingConfigurationRequest = {};
if (
pendingPartialCadence !== null &&
pendingPartialCadence !== config.partialCadenceSeconds
) {
if (pendingPartialCadence !== null && pendingPartialCadence !== config.partialCadenceSeconds) {
request.partialCadenceSeconds = pendingPartialCadence;
}
if (
@@ -166,10 +146,7 @@ export function StreamingConfigSection({
) {
request.trailingSilenceSeconds = pendingTrailingSilence;
}
if (
pendingLeadingBuffer !== null &&
pendingLeadingBuffer !== config.leadingBufferSeconds
) {
if (pendingLeadingBuffer !== null && pendingLeadingBuffer !== config.leadingBufferSeconds) {
request.leadingBufferSeconds = pendingLeadingBuffer;
}

@@ -236,9 +213,7 @@ export function StreamingConfigSection({
min={LIMITS.partialCadenceSeconds.min}
max={LIMITS.partialCadenceSeconds.max}
value={effectivePartialCadence}
onChange={(e) =>
setPendingPartialCadence(parseNumber(e.target.value))
}
onChange={(e) => setPendingPartialCadence(parseNumber(e.target.value))}
disabled={isUpdating}
/>
<p className="text-xs text-muted-foreground">
@@ -257,9 +232,7 @@ export function StreamingConfigSection({
min={LIMITS.minPartialAudioSeconds.min}
max={LIMITS.minPartialAudioSeconds.max}
value={effectiveMinPartialAudio}
onChange={(e) =>
setPendingMinPartialAudio(parseNumber(e.target.value))
}
onChange={(e) => setPendingMinPartialAudio(parseNumber(e.target.value))}
disabled={isUpdating}
/>
<p className="text-xs text-muted-foreground">
@@ -278,9 +251,7 @@ export function StreamingConfigSection({
min={LIMITS.maxSegmentDurationSeconds.min}
max={LIMITS.maxSegmentDurationSeconds.max}
value={effectiveMaxSegmentDuration}
onChange={(e) =>
setPendingMaxSegmentDuration(parseNumber(e.target.value))
}
onChange={(e) => setPendingMaxSegmentDuration(parseNumber(e.target.value))}
disabled={isUpdating}
/>
<p className="text-xs text-muted-foreground">
@@ -299,9 +270,7 @@ export function StreamingConfigSection({
min={LIMITS.minSpeechDurationSeconds.min}
max={LIMITS.minSpeechDurationSeconds.max}
value={effectiveMinSpeechDuration}
onChange={(e) =>
setPendingMinSpeechDuration(parseNumber(e.target.value))
}
onChange={(e) => setPendingMinSpeechDuration(parseNumber(e.target.value))}
disabled={isUpdating}
/>
<p className="text-xs text-muted-foreground">
@@ -320,14 +289,10 @@ export function StreamingConfigSection({
min={LIMITS.trailingSilenceSeconds.min}
max={LIMITS.trailingSilenceSeconds.max}
value={effectiveTrailingSilence}
onChange={(e) =>
setPendingTrailingSilence(parseNumber(e.target.value))
}
onChange={(e) => setPendingTrailingSilence(parseNumber(e.target.value))}
disabled={isUpdating}
/>
<p className="text-xs text-muted-foreground">
Silence to include after speech ends.
</p>
<p className="text-xs text-muted-foreground">Silence to include after speech ends.</p>
</div>

<div className="space-y-2">
@@ -341,14 +306,10 @@ export function StreamingConfigSection({
min={LIMITS.leadingBufferSeconds.min}
max={LIMITS.leadingBufferSeconds.max}
value={effectiveLeadingBuffer}
onChange={(e) =>
setPendingLeadingBuffer(parseNumber(e.target.value))
}
onChange={(e) => setPendingLeadingBuffer(parseNumber(e.target.value))}
disabled={isUpdating}
/>
<p className="text-xs text-muted-foreground">
Audio to prepend before speech begins.
</p>
<p className="text-xs text-muted-foreground">Audio to prepend before speech begins.</p>
</div>
</div>

@@ -105,10 +105,7 @@ export function TranscriptionEngineSection({
() => (config ? getComputeTypesForDevice(effectiveDevice, config) : []),
[effectiveDevice, config]
);
const resourceSnapshot = useMemo(
() => getResourceSnapshot(serverInfo),
[serverInfo]
);
const resourceSnapshot = useMemo(() => getResourceSnapshot(serverInfo), [serverInfo]);
const suggestedComputeType = useMemo(
() =>
getSuggestedComputeType(
@@ -135,10 +132,7 @@ export function TranscriptionEngineSection({
[effectiveDevice, sortedModelSizes, computeTypeForEstimate, resourceSnapshot]
);
const resourceEstimate = useMemo(
() =>
modelSizeValue
? estimateResourceUsage(modelSizeValue, computeTypeForEstimate)
: null,
() => (modelSizeValue ? estimateResourceUsage(modelSizeValue, computeTypeForEstimate) : null),
[modelSizeValue, computeTypeForEstimate]
);
const deviceValue = effectiveDevice === 'unspecified' ? '' : effectiveDevice;
@@ -149,10 +143,7 @@ export function TranscriptionEngineSection({
const recommendedComputeType = suggestedComputeType ?? computeTypeForEstimate;

useEffect(() => {
if (
pendingComputeType !== null &&
!availableComputeTypes.includes(pendingComputeType)
) {
if (pendingComputeType !== null && !availableComputeTypes.includes(pendingComputeType)) {
setPendingComputeType(null);
}
}, [availableComputeTypes, pendingComputeType]);
@@ -253,18 +244,10 @@ export function TranscriptionEngineSection({
<Label htmlFor="device" className="text-sm">
Device
</Label>
<Select
value={deviceValue}
onValueChange={handleDeviceChange}
disabled={isReconfiguring}
>
<Select value={deviceValue} onValueChange={handleDeviceChange} disabled={isReconfiguring}>
<SelectTrigger id="device">
<SelectValue
placeholder={
effectiveDevice === 'unspecified'
? 'Server default'
: 'Select device'
}
placeholder={effectiveDevice === 'unspecified' ? 'Server default' : 'Select device'}
/>
</SelectTrigger>
<SelectContent>
@@ -346,11 +329,7 @@ export function TranscriptionEngineSection({

{/* Action buttons */}
<div className="flex items-center gap-2 pt-2">
<Button
onClick={handleApplyClick}
disabled={!hasChanges || isReconfiguring}
size="sm"
>
<Button onClick={handleApplyClick} disabled={!hasChanges || isReconfiguring} size="sm">
{isReconfiguring ? (
<>
<Loader2 className={cn(iconSize.sm, 'mr-2 animate-spin')} />

@@ -31,9 +31,5 @@ export function useCachedModelCatalog<T extends ProviderConfig>(
models_last_updated: cachedUpdate.updatedAt,
models_source: 'cache',
});
}, [
config,
configType,
setConfig,
]);
}, [config, configType, setConfig]);
}

@@ -2,11 +2,7 @@
* Shared helpers for AI configuration model catalogs.
*/

import type {
AIProviderConfig,
ModelCatalogEntry,
TranscriptionProviderConfig,
} from '@/api/types';
import type { AIProviderConfig, ModelCatalogEntry, TranscriptionProviderConfig } from '@/api/types';
import { getCachedModelCatalog } from '@/lib/ai-providers/model-catalog-cache';

export type ConfigType = 'transcription' | 'summary' | 'embedding';

@@ -1,7 +1,7 @@
// AI Configuration Settings Section

import { motion } from 'framer-motion';
import { Brain } from 'lucide-react';
import { AlertTriangle, Brain } from 'lucide-react';
import { useEffect, useState } from 'react';
import type {
AIProviderConfig,
@@ -10,6 +10,7 @@ import type {
TranscriptionProviderType,
} from '@/api/types';
import { Accordion } from '@/components/ui/accordion';
import { Alert, AlertDescription, AlertTitle } from '@/components/ui/alert';
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
import { addClientLog } from '@/lib/client-logs';
import { toast } from '@/hooks/use-toast';
@@ -236,87 +237,88 @@ export function AIConfigSection() {
|
||||
preferences.updateAIConfig(configType, { selected_model: model, available_models: nextModels });
|
||||
};
|
||||
|
||||
const createFetchModelsHandler = (configType: ConfigType) => async (
|
||||
forceRefresh: boolean = false
|
||||
) => {
|
||||
const config =
|
||||
configType === 'transcription'
|
||||
? transcriptionConfig
|
||||
: configType === 'summary'
|
||||
? summaryConfig
|
||||
: embeddingConfig;
|
||||
const setFetching =
|
||||
configType === 'transcription'
|
||||
? setFetchingTranscriptionModels
|
||||
: configType === 'summary'
|
||||
? setFetchingSummaryModels
|
||||
: setFetchingEmbeddingModels;
|
||||
const createFetchModelsHandler =
|
||||
(configType: ConfigType) =>
|
||||
async (forceRefresh: boolean = false) => {
|
||||
const config =
|
||||
configType === 'transcription'
|
||||
? transcriptionConfig
|
||||
: configType === 'summary'
|
||||
? summaryConfig
|
||||
: embeddingConfig;
|
||||
const setFetching =
|
||||
configType === 'transcription'
|
||||
? setFetchingTranscriptionModels
|
||||
: configType === 'summary'
|
||||
? setFetchingSummaryModels
|
||||
: setFetchingEmbeddingModels;
|
||||
|
||||
setFetching(true);
|
||||
try {
|
||||
const result = await fetchProviderModels(
|
||||
config.provider,
|
||||
config.base_url,
|
||||
config.api_key,
|
||||
configType,
|
||||
{ forceRefresh }
|
||||
);
|
||||
if (result.success) {
|
||||
const nextModels = ensureSelectedModel(result.models, config.selected_model);
|
||||
const modelsLastUpdated = result.updatedAt ?? config.models_last_updated ?? null;
|
||||
const modelsSource = result.source ?? config.models_source ?? null;
|
||||
if (configType === 'transcription') {
|
||||
setTranscriptionConfig((prev) => ({
|
||||
...prev,
|
||||
setFetching(true);
|
||||
try {
|
||||
const result = await fetchProviderModels(
|
||||
config.provider,
|
||||
config.base_url,
|
||||
config.api_key,
|
||||
configType,
|
||||
{ forceRefresh }
|
||||
);
|
||||
if (result.success) {
|
||||
const nextModels = ensureSelectedModel(result.models, config.selected_model);
|
||||
const modelsLastUpdated = result.updatedAt ?? config.models_last_updated ?? null;
|
||||
const modelsSource = result.source ?? config.models_source ?? null;
|
||||
if (configType === 'transcription') {
|
||||
setTranscriptionConfig((prev) => ({
|
||||
...prev,
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
}));
|
||||
} else if (configType === 'summary') {
|
||||
setSummaryConfig((prev) => ({
|
||||
...prev,
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
}));
|
||||
} else {
|
||||
setEmbeddingConfig((prev) => ({
|
||||
...prev,
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
}));
|
||||
}
|
||||
preferences.updateAIConfig(configType, {
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
}));
|
||||
} else if (configType === 'summary') {
|
||||
setSummaryConfig((prev) => ({
|
||||
...prev,
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
}));
|
||||
});
|
||||
const sourceLabel = result.source === 'cache' ? 'Loaded from cache' : 'Loaded from API';
|
||||
const forceLabel = forceRefresh ? ' (forced refresh)' : '';
|
||||
const description =
|
||||
result.error || `${sourceLabel}${forceLabel} • ${nextModels.length} models`;
|
||||
toast({
|
||||
title: result.stale ? 'Models loaded (stale cache)' : 'Models loaded',
|
||||
description,
|
||||
variant: result.stale ? 'destructive' : 'default',
|
||||
});
|
||||
} else {
|
||||
setEmbeddingConfig((prev) => ({
|
||||
...prev,
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
}));
|
||||
toast({
|
||||
title: 'Failed to fetch models',
|
||||
description: result.error,
|
||||
variant: 'destructive',
|
||||
});
|
||||
}
|
||||
preferences.updateAIConfig(configType, {
|
||||
available_models: nextModels,
|
||||
models_last_updated: modelsLastUpdated,
|
||||
models_source: modelsSource,
|
||||
});
|
||||
const sourceLabel = result.source === 'cache' ? 'Loaded from cache' : 'Loaded from API';
|
||||
const forceLabel = forceRefresh ? ' (forced refresh)' : '';
|
||||
const description = result.error || `${sourceLabel}${forceLabel} • ${nextModels.length} models`;
|
||||
toast({
|
||||
title: result.stale ? 'Models loaded (stale cache)' : 'Models loaded',
|
||||
description,
|
||||
variant: result.stale ? 'destructive' : 'default',
|
||||
});
|
||||
} else {
|
||||
toast({
|
||||
} catch (error) {
|
||||
toastError({
|
||||
title: 'Failed to fetch models',
|
||||
description: result.error,
|
||||
variant: 'destructive',
|
||||
error,
|
||||
fallback: 'Unknown error',
|
||||
});
|
||||
} finally {
|
||||
setFetching(false);
|
||||
}
|
||||
} catch (error) {
|
||||
toastError({
|
||||
title: 'Failed to fetch models',
|
||||
error,
|
||||
fallback: 'Unknown error',
|
||||
});
|
||||
} finally {
|
||||
setFetching(false);
|
||||
}
|
||||
};
|
||||
};
|
||||
|
||||
useCachedModelCatalog('transcription', transcriptionConfig, setTranscriptionConfig);
|
||||
useCachedModelCatalog('summary', summaryConfig, setSummaryConfig);
|
||||
@@ -416,6 +418,16 @@ export function AIConfigSection() {
|
||||
</div>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
{!encryptionAvailable && (
|
||||
<Alert className="mb-4">
|
||||
<AlertTriangle className="h-4 w-4" />
|
||||
<AlertTitle>Secure storage unavailable</AlertTitle>
|
||||
<AlertDescription>
|
||||
API keys will not be persisted in this environment. Use the desktop app or a
|
||||
secure (HTTPS) origin to enable encrypted storage.
|
||||
</AlertDescription>
|
||||
</Alert>
|
||||
)}
|
||||
<Accordion type="multiple" className="space-y-3">
|
||||
<ProviderConfigCard
|
||||
title="Transcription"
|
||||
@@ -434,10 +446,7 @@ export function AIConfigSection() {
|
||||
'transcription',
|
||||
transcriptionConfig
|
||||
)}
|
||||
manualModelHint={getManualModelHint(
|
||||
'transcription',
|
||||
transcriptionConfig.provider
|
||||
)}
|
||||
manualModelHint={getManualModelHint('transcription', transcriptionConfig.provider)}
|
||||
/>
|
||||
|
||||
<ProviderConfigCard
|
||||
|
||||
@@ -3,12 +3,7 @@ import { AlertCircle, Loader2, Mic, Play, RefreshCw, Square, Volume2 } from 'luc
|
||||
import { Button } from '@/components/ui/button';
|
||||
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
|
||||
import { Label } from '@/components/ui/label';
|
||||
import {
|
||||
Select,
|
||||
SelectContent,
|
||||
SelectItem,
|
||||
SelectTrigger,
|
||||
} from '@/components/ui/select';
|
||||
import { Select, SelectContent, SelectItem, SelectTrigger } from '@/components/ui/select';
|
||||
import { Separator } from '@/components/ui/separator';
|
||||
import { Slider } from '@/components/ui/slider';
|
||||
import { Switch } from '@/components/ui/switch';
|
||||
@@ -296,7 +291,8 @@ export function AudioDevicesSection({
|
||||
</Button>
|
||||
</div>
|
||||
<p className="text-xs text-muted-foreground">
|
||||
This only affects recording. App playback uses the system default output.
|
||||
This only affects recording. App playback uses the system default
|
||||
output.
|
||||
</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -8,7 +8,7 @@
|
||||
import { motion } from 'framer-motion';
|
||||
import { AlertCircle, Loader2, RefreshCw, Stethoscope, X } from 'lucide-react';
|
||||
import { useCallback, useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { ConnectionDiagnostics, DiagnosticStep } from '@/api/types';
|
||||
import { SuccessIcon } from '@/components/icons/status-icons';
|
||||
import { Button } from '@/components/ui/button';
|
||||
|
||||
@@ -1,13 +1,6 @@
|
||||
import { motion } from 'framer-motion';
|
||||
import { useEffect, useState } from 'react';
|
||||
import {
|
||||
Briefcase,
|
||||
Code,
|
||||
FileText,
|
||||
FolderOpen,
|
||||
MessageSquare,
|
||||
Sparkles,
|
||||
} from 'lucide-react';
|
||||
import { Briefcase, Code, FileText, FolderOpen, MessageSquare, Sparkles } from 'lucide-react';
|
||||
import type { AIFormat, AITone, AIVerbosity, ExportFormat } from '@/api/types';
|
||||
import { isTauriEnvironment } from '@/api/tauri-adapter';
|
||||
import { Button } from '@/components/ui/button';
|
||||
@@ -98,7 +91,8 @@ const detectOSFromTauriPath = (path: string): DetectedOS => {
|
||||
};
|
||||
|
||||
const getExportLocationPlaceholder = (os: DetectedOS, tauriLocation: string | null): string =>
|
||||
tauriLocation ?? (os === 'windows' ? 'C:\\Users\\<you>\\Documents\\NoteFlow' : '~/Documents/NoteFlow');
|
||||
tauriLocation ??
|
||||
(os === 'windows' ? 'C:\\Users\\<you>\\Documents\\NoteFlow' : '~/Documents/NoteFlow');
|
||||
|
||||
const getExportLocationHint = (os: DetectedOS, tauriLocation: string | null): string => {
|
||||
if (tauriLocation) {
|
||||
@@ -298,7 +292,8 @@ export function ExportAISection({
|
||||
</SelectContent>
|
||||
</Select>
|
||||
<p className="text-xs text-muted-foreground">
|
||||
Presets use tone, format, and verbosity. Custom templates override the default summary prompt.
|
||||
Presets use tone, format, and verbosity. Custom templates override the default
|
||||
summary prompt.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
|
||||
@@ -155,7 +155,11 @@ interface TestAllButtonProps {
|
||||
export function TestAllButton({ onTest, isTesting }: TestAllButtonProps) {
|
||||
return (
|
||||
<Button variant="outline" size="sm" onClick={onTest} disabled={isTesting}>
|
||||
{isTesting ? <Loader2 className={iconWithMargin.mdSpin} /> : <Zap className={iconWithMargin.md} />}
|
||||
{isTesting ? (
|
||||
<Loader2 className={iconWithMargin.mdSpin} />
|
||||
) : (
|
||||
<Zap className={iconWithMargin.md} />
|
||||
)}
|
||||
Test All
|
||||
</Button>
|
||||
);
|
||||
|
||||
@@ -6,14 +6,14 @@
|
||||
|
||||
import { motion } from 'framer-motion';
|
||||
import { Box, Link2 } from 'lucide-react';
|
||||
import { useState } from 'react';
|
||||
import { useCallback, useMemo, useState } from 'react';
|
||||
|
||||
import type { SyncState } from '@/hooks/use-integration-sync';
|
||||
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
|
||||
import { Tabs, TabsContent, TabsList, TabsTrigger } from '@/components/ui/tabs';
|
||||
|
||||
import { CustomIntegrationDialog, TestAllButton } from './custom-integration-dialog';
|
||||
import { groupIntegrationsByType } from './helpers';
|
||||
import { IntegrationSettingsProvider } from './integration-settings-context';
|
||||
import { IntegrationItem } from './integration-item';
|
||||
import type { IntegrationsSectionProps } from './types';
|
||||
import { useIntegrationHandlers } from './use-integration-handlers';
|
||||
@@ -42,103 +42,122 @@ export function IntegrationsSection({
|
||||
handleTestAllIntegrations,
|
||||
} = useIntegrationHandlers({ integrations, setIntegrations });
|
||||
|
||||
const handleTestIntegrationWithState = useCallback(
|
||||
(integration: Parameters<typeof handleTestIntegration>[0]) =>
|
||||
handleTestIntegration(integration, setTestingIntegration),
|
||||
[handleTestIntegration, setTestingIntegration]
|
||||
);
|
||||
|
||||
const contextValue = useMemo(
|
||||
() => ({
|
||||
syncStates,
|
||||
triggerSync,
|
||||
testingIntegrationId: testingIntegration,
|
||||
testIntegration: handleTestIntegrationWithState,
|
||||
oauthState,
|
||||
resetOAuth,
|
||||
pendingOAuthIntegrationIdRef,
|
||||
toggleIntegration: handleIntegrationToggle,
|
||||
calendarConnect: handleCalendarConnect,
|
||||
calendarDisconnect: handleCalendarDisconnect,
|
||||
updateIntegrationConfig: handleUpdateIntegrationConfig,
|
||||
removeIntegration: handleRemoveIntegration,
|
||||
}),
|
||||
[
|
||||
syncStates,
|
||||
triggerSync,
|
||||
testingIntegration,
|
||||
handleTestIntegrationWithState,
|
||||
oauthState,
|
||||
resetOAuth,
|
||||
pendingOAuthIntegrationIdRef,
|
||||
handleIntegrationToggle,
|
||||
handleCalendarConnect,
|
||||
handleCalendarDisconnect,
|
||||
handleUpdateIntegrationConfig,
|
||||
handleRemoveIntegration,
|
||||
]
|
||||
);
|
||||
|
||||
const groupedIntegrations = groupIntegrationsByType(integrations);
|
||||
|
||||
return (
|
||||
<motion.div
|
||||
initial={{ opacity: 0, y: 20 }}
|
||||
animate={{ opacity: 1, y: 0 }}
|
||||
transition={{ delay: 0.32 }}
|
||||
>
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<div className="flex items-center justify-between">
|
||||
<div className="flex items-center gap-3">
|
||||
<div className="h-10 w-10 rounded-lg bg-primary/10 flex items-center justify-center">
|
||||
<Link2 className="h-5 w-5 text-primary" />
|
||||
<IntegrationSettingsProvider value={contextValue}>
|
||||
<motion.div
|
||||
initial={{ opacity: 0, y: 20 }}
|
||||
animate={{ opacity: 1, y: 0 }}
|
||||
transition={{ delay: 0.32 }}
|
||||
>
|
||||
<Card>
|
||||
<CardHeader>
|
||||
<div className="flex items-center justify-between">
|
||||
<div className="flex items-center gap-3">
|
||||
<div className="h-10 w-10 rounded-lg bg-primary/10 flex items-center justify-center">
|
||||
<Link2 className="h-5 w-5 text-primary" />
|
||||
</div>
|
||||
<div>
|
||||
<CardTitle className="text-lg">Integrations</CardTitle>
|
||||
<CardDescription>Connect external services and tools</CardDescription>
|
||||
</div>
|
||||
</div>
|
||||
<div>
|
||||
<CardTitle className="text-lg">Integrations</CardTitle>
|
||||
<CardDescription>Connect external services and tools</CardDescription>
|
||||
<div className="flex gap-2">
|
||||
<TestAllButton
|
||||
onTest={() => handleTestAllIntegrations(setTestingAllIntegrations)}
|
||||
isTesting={testingAllIntegrations}
|
||||
/>
|
||||
<CustomIntegrationDialog
|
||||
onAdd={(formState, onClose) => handleAddCustomIntegration(formState, onClose)}
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
<div className="flex gap-2">
|
||||
<TestAllButton
|
||||
onTest={() => handleTestAllIntegrations(setTestingAllIntegrations)}
|
||||
isTesting={testingAllIntegrations}
|
||||
/>
|
||||
<CustomIntegrationDialog
|
||||
onAdd={(formState, onClose) => handleAddCustomIntegration(formState, onClose)}
|
||||
/>
|
||||
</div>
|
||||
</div>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<Tabs defaultValue="auth" className="w-full">
|
||||
<TabsList className="grid w-full grid-cols-6">
|
||||
<TabsTrigger value="auth" className="text-xs">
|
||||
Auth/SSO
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="email" className="text-xs">
|
||||
Email
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="calendar" className="text-xs">
|
||||
Calendar
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="pkm" className="text-xs">
|
||||
PKM
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="oidc" className="text-xs">
|
||||
OIDC
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="custom" className="text-xs">
|
||||
Custom
|
||||
</TabsTrigger>
|
||||
</TabsList>
|
||||
</CardHeader>
|
||||
<CardContent>
|
||||
<Tabs defaultValue="auth" className="w-full">
|
||||
<TabsList className="grid w-full grid-cols-6">
|
||||
<TabsTrigger value="auth" className="text-xs">
|
||||
Auth/SSO
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="email" className="text-xs">
|
||||
Email
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="calendar" className="text-xs">
|
||||
Calendar
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="pkm" className="text-xs">
|
||||
PKM
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="oidc" className="text-xs">
|
||||
OIDC
|
||||
</TabsTrigger>
|
||||
<TabsTrigger value="custom" className="text-xs">
|
||||
Custom
|
||||
</TabsTrigger>
|
||||
</TabsList>
|
||||
|
||||
{Object.entries(groupedIntegrations).map(([type, items]) => (
|
||||
<TabsContent key={type} value={type} className="space-y-3 mt-4">
|
||||
{items.length === 0 ? (
|
||||
<div className="text-center py-8 text-muted-foreground">
|
||||
<Box className="h-8 w-8 mx-auto mb-2 opacity-50" />
|
||||
<p className="text-sm">No {type} integrations configured</p>
|
||||
</div>
|
||||
) : (
|
||||
items.map((integration) => {
|
||||
const integrationSyncState: SyncState = syncStates[integration.id] ?? {
|
||||
status: 'idle',
|
||||
lastSync: null,
|
||||
nextSync: null,
|
||||
};
|
||||
|
||||
return (
|
||||
{Object.entries(groupedIntegrations).map(([type, items]) => (
|
||||
<TabsContent key={type} value={type} className="space-y-3 mt-4">
|
||||
{items.length === 0 ? (
|
||||
<div className="text-center py-8 text-muted-foreground">
|
||||
<Box className="h-8 w-8 mx-auto mb-2 opacity-50" />
|
||||
<p className="text-sm">No {type} integrations configured</p>
|
||||
</div>
|
||||
) : (
|
||||
items.map((integration) => (
|
||||
<IntegrationItem
|
||||
key={integration.id}
|
||||
integration={integration}
|
||||
isExpanded={expandedIntegration === integration.id}
|
||||
onToggleExpand={setExpandedIntegration}
|
||||
syncState={integrationSyncState}
|
||||
triggerSync={triggerSync}
|
||||
testingIntegration={testingIntegration}
|
||||
oauthState={oauthState}
|
||||
onIntegrationToggle={handleIntegrationToggle}
|
||||
onCalendarConnect={handleCalendarConnect}
|
||||
onCalendarDisconnect={handleCalendarDisconnect}
|
||||
onUpdateConfig={handleUpdateIntegrationConfig}
|
||||
onTest={(i) => handleTestIntegration(i, setTestingIntegration)}
|
||||
onRemove={handleRemoveIntegration}
|
||||
onResetOAuth={resetOAuth}
|
||||
pendingOAuthRef={pendingOAuthIntegrationIdRef}
|
||||
/>
|
||||
);
|
||||
})
|
||||
)}
|
||||
</TabsContent>
|
||||
))}
|
||||
</Tabs>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</motion.div>
|
||||
))
|
||||
)}
|
||||
</TabsContent>
|
||||
))}
|
||||
</Tabs>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</motion.div>
|
||||
</IntegrationSettingsProvider>
|
||||
);
|
||||
}
|
||||
|
||||
|
||||
@@ -3,7 +3,6 @@
|
||||
*/
|
||||
|
||||
import { AlertTriangle, Clock, Loader2, X } from 'lucide-react';
|
||||
import type { MutableRefObject } from 'react';
|
||||
|
||||
import type { Integration } from '@/api/types';
|
||||
import { IntegrationConfigPanel } from '@/components/integration-config-panel';
|
||||
@@ -12,51 +11,41 @@ import { Badge } from '@/components/ui/badge';
|
||||
import { Button } from '@/components/ui/button';
|
||||
import { Collapsible, CollapsibleContent, CollapsibleTrigger } from '@/components/ui/collapsible';
|
||||
import { Switch } from '@/components/ui/switch';
|
||||
import type { OAuthFlowState } from '@/hooks/use-oauth-flow';
|
||||
import type { SyncState } from '@/hooks/use-integration-sync';
|
||||
import { formatTimestamp } from '@/lib/format';
|
||||
import { getIntegrationIcon, hasRequiredIntegrationFields } from '@/lib/integration-utils';
|
||||
import { iconWithMargin } from '@/lib/styles';
|
||||
|
||||
import { getCalendarProvider } from './helpers';
|
||||
import { useIntegrationSettingsContext } from './integration-settings-context';
|
||||
|
||||
interface IntegrationItemProps {
|
||||
integration: Integration;
|
||||
isExpanded: boolean;
|
||||
onToggleExpand: (id: string | null) => void;
|
||||
syncState: SyncState;
|
||||
triggerSync: (id: string) => void;
|
||||
testingIntegration: string | null;
|
||||
oauthState: OAuthFlowState;
|
||||
onIntegrationToggle: (integration: Integration) => void;
|
||||
onCalendarConnect: (integration: Integration) => void;
|
||||
onCalendarDisconnect: (integration: Integration) => void;
|
||||
onUpdateConfig: (id: string, config: Partial<Integration>) => void;
|
||||
onTest: (integration: Integration) => void;
|
||||
onRemove: (id: string) => void;
|
||||
onResetOAuth: () => void;
|
||||
pendingOAuthRef: MutableRefObject<string | null>;
|
||||
}
|
||||
|
||||
export function IntegrationItem({
|
||||
integration,
|
||||
isExpanded,
|
||||
onToggleExpand,
|
||||
syncState,
|
||||
triggerSync,
|
||||
testingIntegration,
|
||||
oauthState,
|
||||
onIntegrationToggle,
|
||||
onCalendarConnect,
|
||||
onCalendarDisconnect,
|
||||
onUpdateConfig,
|
||||
onTest,
|
||||
onRemove,
|
||||
onResetOAuth,
|
||||
pendingOAuthRef,
|
||||
}: IntegrationItemProps) {
|
||||
const {
|
||||
syncStates,
|
||||
triggerSync,
|
||||
testingIntegrationId,
|
||||
testIntegration,
|
||||
oauthState,
|
||||
resetOAuth,
|
||||
pendingOAuthIntegrationIdRef,
|
||||
toggleIntegration,
|
||||
calendarConnect,
|
||||
calendarDisconnect,
|
||||
updateIntegrationConfig,
|
||||
removeIntegration,
|
||||
} = useIntegrationSettingsContext();
|
||||
const Icon = getIntegrationIcon(integration.type);
|
||||
const calendarProvider = integration.type === 'calendar' ? getCalendarProvider(integration) : null;
|
||||
const calendarProvider =
|
||||
integration.type === 'calendar' ? getCalendarProvider(integration) : null;
|
||||
const isCalendarProviderSupported = Boolean(calendarProvider);
|
||||
const hasServerIntegration = Boolean(integration.integration_id);
|
||||
const hasRequiredFields = hasRequiredIntegrationFields(integration);
|
||||
@@ -67,18 +56,13 @@ export function IntegrationItem({
|
||||
oauthState.status === 'awaiting_callback' ||
|
||||
oauthState.status === 'completing');
|
||||
const isCalendarConnected =
|
||||
integration.type === 'calendar' &&
|
||||
integration.status === 'connected' &&
|
||||
hasServerIntegration;
|
||||
integration.type === 'calendar' && integration.status === 'connected' && hasServerIntegration;
|
||||
const isCredentialMismatch = integration.status === 'connected' && !hasRequiredFields;
|
||||
const displayStatus =
|
||||
isCredentialMismatch
|
||||
? 'error'
|
||||
: integration.type === 'calendar' &&
|
||||
integration.status === 'connected' &&
|
||||
!hasServerIntegration
|
||||
? 'disconnected'
|
||||
: integration.status;
|
||||
const displayStatus = isCredentialMismatch
|
||||
? 'error'
|
||||
: integration.type === 'calendar' && integration.status === 'connected' && !hasServerIntegration
|
||||
? 'disconnected'
|
||||
: integration.status;
|
||||
const isPkmSyncEnabled =
|
||||
integration.type === 'pkm' ? Boolean(integration.pkm_config?.sync_enabled) : true;
|
||||
const isSyncableConnected =
|
||||
@@ -88,6 +72,11 @@ export function IntegrationItem({
|
||||
hasServerIntegration &&
|
||||
hasRequiredFields;
|
||||
const canTestIntegration = integration.type !== 'calendar';
|
||||
const syncState = syncStates[integration.id] ?? {
|
||||
status: 'idle',
|
||||
lastSync: null,
|
||||
nextSync: null,
|
||||
};
|
||||
|
||||
return (
|
||||
<Collapsible
|
||||
@@ -142,15 +131,15 @@ export function IntegrationItem({
|
||||
variant={isCalendarConnected ? 'outline' : 'default'}
|
||||
size="sm"
|
||||
onClick={(event) => {
|
||||
event.stopPropagation();
|
||||
if (isCalendarConnected) {
|
||||
void onCalendarDisconnect(integration);
|
||||
} else {
|
||||
void onCalendarConnect(integration);
|
||||
}
|
||||
}}
|
||||
disabled={!isCalendarProviderSupported || isOAuthPending}
|
||||
>
|
||||
event.stopPropagation();
|
||||
if (isCalendarConnected) {
|
||||
void calendarDisconnect(integration);
|
||||
} else {
|
||||
void calendarConnect(integration);
|
||||
}
|
||||
}}
|
||||
disabled={!isCalendarProviderSupported || isOAuthPending}
|
||||
>
|
||||
{isOAuthPending ? <Loader2 className={iconWithMargin.mdSpin} /> : null}
|
||||
{isOAuthPending
|
||||
? oauthState.status === 'awaiting_callback'
|
||||
@@ -166,8 +155,8 @@ export function IntegrationItem({
|
||||
size="sm"
|
||||
onClick={(event) => {
|
||||
event.stopPropagation();
|
||||
onResetOAuth();
|
||||
pendingOAuthRef.current = null;
|
||||
resetOAuth();
|
||||
pendingOAuthIntegrationIdRef.current = null;
|
||||
}}
|
||||
>
|
||||
Cancel
|
||||
@@ -177,7 +166,7 @@ export function IntegrationItem({
|
||||
) : (
|
||||
<Switch
|
||||
checked={integration.status === 'connected'}
|
||||
onCheckedChange={() => onIntegrationToggle(integration)}
|
||||
onCheckedChange={() => toggleIntegration(integration)}
|
||||
onClick={(e) => e.stopPropagation()}
|
||||
/>
|
||||
)}
|
||||
@@ -188,7 +177,7 @@ export function IntegrationItem({
|
||||
className="h-8 w-8"
|
||||
onClick={(e) => {
|
||||
e.stopPropagation();
|
||||
onRemove(integration.id);
|
||||
removeIntegration(integration.id);
|
||||
}}
|
||||
>
|
||||
<X className="h-4 w-4" />
|
||||
@@ -213,9 +202,9 @@ export function IntegrationItem({
|
||||
)}
|
||||
<IntegrationConfigPanel
|
||||
integration={integration}
|
||||
onUpdate={(config) => onUpdateConfig(integration.id, config)}
|
||||
onTest={canTestIntegration ? () => onTest(integration) : undefined}
|
||||
isTesting={canTestIntegration && testingIntegration === integration.id}
|
||||
onUpdate={(config) => updateIntegrationConfig(integration.id, config)}
|
||||
onTest={canTestIntegration ? () => void testIntegration(integration) : undefined}
|
||||
isTesting={canTestIntegration && testingIntegrationId === integration.id}
|
||||
/>
|
||||
</div>
|
||||
</CollapsibleContent>
|
||||
|
||||
@@ -0,0 +1,45 @@
|
||||
import { createContext, useContext } from 'react';
|
||||
import type { MutableRefObject, ReactNode } from 'react';
|
||||
|
||||
import type { Integration } from '@/api/types';
|
||||
import type { OAuthFlowState } from '@/hooks/use-oauth-flow';
|
||||
import type { SyncState } from '@/hooks/use-integration-sync';
|
||||
|
||||
export interface IntegrationSettingsContextValue {
|
||||
syncStates: Record<string, SyncState>;
|
||||
triggerSync: (integrationId: string) => void;
|
||||
testingIntegrationId: string | null;
|
||||
testIntegration: (integration: Integration) => Promise<void>;
|
||||
oauthState: OAuthFlowState;
|
||||
resetOAuth: () => void;
|
||||
pendingOAuthIntegrationIdRef: MutableRefObject<string | null>;
|
||||
toggleIntegration: (integration: Integration) => void;
|
||||
calendarConnect: (integration: Integration) => void;
|
||||
calendarDisconnect: (integration: Integration) => void;
|
||||
updateIntegrationConfig: (integrationId: string, config: Partial<Integration>) => void;
|
||||
removeIntegration: (integrationId: string) => void;
|
||||
}
|
||||
|
||||
const IntegrationSettingsContext = createContext<IntegrationSettingsContextValue | null>(null);
|
||||
|
||||
export function IntegrationSettingsProvider({
|
||||
value,
|
||||
children,
|
||||
}: {
|
||||
value: IntegrationSettingsContextValue;
|
||||
children: ReactNode;
|
||||
}) {
|
||||
return (
|
||||
<IntegrationSettingsContext.Provider value={value}>
|
||||
{children}
|
||||
</IntegrationSettingsContext.Provider>
|
||||
);
|
||||
}
|
||||
|
||||
export function useIntegrationSettingsContext(): IntegrationSettingsContextValue {
|
||||
const context = useContext(IntegrationSettingsContext);
|
||||
if (!context) {
|
||||
throw new Error('useIntegrationSettingsContext must be used within IntegrationSettingsProvider');
|
||||
}
|
||||
return context;
|
||||
}
|
||||
@@ -0,0 +1,319 @@
|
||||
import { act, renderHook } from '@testing-library/react';
|
||||
import { beforeEach, describe, expect, it, vi } from 'vitest';
|
||||
|
||||
import { DEFAULT_OIDC_CLAIM_MAPPING } from '@/api/types';
|
||||
import type { Integration } from '@/api/types';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
import { preferences } from '@/lib/preferences';
|
||||
import { useIntegrationHandlers } from './use-integration-handlers';
|
||||
|
||||
const integrationState = vi.hoisted(() => ({
|
||||
integrations: [] as Integration[],
|
||||
}));
|
||||
|
||||
const mockPreferences = vi.hoisted(() => ({
|
||||
getIntegrations: vi.fn(() => integrationState.integrations),
|
||||
updateIntegration: vi.fn((id: string, updates: Partial<Integration>) => {
|
||||
integrationState.integrations = integrationState.integrations.map((integration) =>
|
||||
integration.id === id ? { ...integration, ...updates } : integration
|
||||
);
|
||||
}),
|
||||
addCustomIntegration: vi.fn((name: string, config: Integration['webhook_config']) => {
|
||||
const id = `custom-${integrationState.integrations.length + 1}`;
|
||||
integrationState.integrations = [
|
||||
...integrationState.integrations,
|
||||
{
|
||||
id,
|
||||
name,
|
||||
type: 'custom',
|
||||
status: 'disconnected',
|
||||
webhook_config: config,
|
||||
},
|
||||
];
|
||||
}),
|
||||
removeIntegration: vi.fn((id: string) => {
|
||||
integrationState.integrations = integrationState.integrations.filter(
|
||||
(integration) => integration.id !== id
|
||||
);
|
||||
}),
|
||||
}));
|
||||
|
||||
const secureStorageState = vi.hoisted(() => ({
|
||||
available: true,
|
||||
}));
|
||||
|
||||
const secureSecretsState = vi.hoisted(() => ({
|
||||
saveSecrets: vi.fn(),
|
||||
clearSecrets: vi.fn(),
|
||||
}));
|
||||
|
||||
const oauthFlowState = vi.hoisted(() => ({
|
||||
state: {
|
||||
status: 'idle',
|
||||
provider: null,
|
||||
authUrl: null,
|
||||
error: null,
|
||||
connection: null,
|
||||
integrationId: null,
|
||||
},
|
||||
initiateAuth: vi.fn(),
|
||||
disconnect: vi.fn(),
|
||||
reset: vi.fn(),
|
||||
}));
|
||||
|
||||
const oidcProvidersState = vi.hoisted(() => ({
|
||||
createProvider: vi.fn(),
|
||||
updateProvider: vi.fn(),
|
||||
}));
|
||||
|
||||
const apiState = vi.hoisted(() => ({
|
||||
testOidcConnection: vi.fn(),
|
||||
}));
|
||||
|
||||
vi.mock('@/lib/preferences', () => ({
|
||||
preferences: mockPreferences,
|
||||
}));
|
||||
|
||||
vi.mock('@/lib/crypto', () => ({
|
||||
isSecureStorageAvailable: () => secureStorageState.available,
|
||||
}));
|
||||
|
||||
vi.mock('@/hooks/use-secure-integration-secrets', () => ({
|
||||
useSecureIntegrationSecrets: () => secureSecretsState,
|
||||
}));
|
||||
|
||||
vi.mock('@/hooks/use-oauth-flow', () => ({
|
||||
useOAuthFlow: () => oauthFlowState,
|
||||
}));
|
||||
|
||||
vi.mock('@/hooks/use-oidc-providers', () => ({
|
||||
useOidcProviders: () => oidcProvidersState,
|
||||
}));
|
||||
|
||||
vi.mock('@/api/interface', () => ({
|
||||
getAPI: () => apiState,
|
||||
}));
|
||||
|
||||
vi.mock('@/hooks/use-toast', () => ({
|
||||
toast: vi.fn(),
|
||||
}));
|
||||
|
||||
vi.mock('@/lib/error-reporting', () => ({
|
||||
toastError: vi.fn(() => 'Toast error'),
|
||||
}));
|
||||
|
||||
vi.mock('@/contexts/workspace-state', () => ({
|
||||
useWorkspace: () => ({ currentWorkspace: { id: 'workspace-1' } }),
|
||||
}));
|
||||
|
||||
function createIntegration(overrides: Partial<Integration>): Integration {
|
||||
return {
|
||||
id: 'integration-1',
|
||||
name: 'Test Integration',
|
||||
type: 'custom',
|
||||
status: 'disconnected',
|
||||
webhook_config: { url: 'https://example.com', method: 'POST' },
|
||||
...overrides,
|
||||
};
|
||||
}
|
||||
|
||||
describe('useIntegrationHandlers', () => {
|
||||
beforeEach(() => {
|
||||
integrationState.integrations = [];
|
||||
secureStorageState.available = true;
|
||||
secureSecretsState.saveSecrets.mockReset();
|
||||
secureSecretsState.clearSecrets.mockReset();
|
||||
oidcProvidersState.createProvider.mockReset();
|
||||
oidcProvidersState.updateProvider.mockReset();
|
||||
apiState.testOidcConnection.mockReset();
|
||||
vi.mocked(preferences.getIntegrations).mockClear();
|
||||
vi.mocked(preferences.updateIntegration).mockClear();
|
||||
vi.mocked(preferences.addCustomIntegration).mockClear();
|
||||
vi.mocked(preferences.removeIntegration).mockClear();
|
||||
vi.mocked(toast).mockClear();
|
||||
});
|
||||
|
||||
it('blocks toggling when credentials are missing', () => {
|
||||
const integration = createIntegration({
|
||||
type: 'calendar',
|
||||
name: 'Google Calendar',
|
||||
oauth_config: undefined,
|
||||
});
|
||||
integrationState.integrations = [integration];
|
||||
const setIntegrations = vi.fn();
|
||||
|
||||
const { result } = renderHook(() =>
|
||||
useIntegrationHandlers({ integrations: integrationState.integrations, setIntegrations })
|
||||
);
|
||||
|
||||
act(() => {
|
||||
result.current.handleIntegrationToggle(integration);
|
||||
});
|
||||
|
||||
expect(toast).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
title: 'Missing credentials',
|
||||
variant: 'destructive',
|
||||
})
|
||||
);
|
||||
expect(preferences.updateIntegration).not.toHaveBeenCalled();
|
||||
});
|
||||
|
||||
it('toggles integration status when configured', () => {
|
||||
const integration = createIntegration({
|
||||
id: 'integration-2',
|
||||
type: 'custom',
|
||||
status: 'disconnected',
|
||||
webhook_config: { url: 'https://example.com', method: 'POST' },
|
||||
});
|
||||
integrationState.integrations = [integration];
|
||||
const setIntegrations = vi.fn();
|
||||
|
||||
const { result } = renderHook(() =>
|
||||
useIntegrationHandlers({ integrations: integrationState.integrations, setIntegrations })
|
||||
);
|
||||
|
||||
act(() => {
|
||||
result.current.handleIntegrationToggle(integration);
|
||||
});
|
||||
|
||||
expect(preferences.updateIntegration).toHaveBeenCalledWith(integration.id, {
|
||||
status: 'connected',
|
||||
});
|
||||
expect(setIntegrations).toHaveBeenCalledWith(preferences.getIntegrations());
|
||||
expect(toast).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
title: 'Connected',
|
||||
})
|
||||
);
|
||||
});
|
||||
|
||||
it('persists secrets when updating configs with secure storage', async () => {
|
||||
const integration = createIntegration({
|
||||
id: 'integration-3',
|
||||
type: 'email',
|
||||
email_config: { provider_type: 'api', api_key: 'old-key' },
|
||||
});
|
||||
integrationState.integrations = [integration];
|
||||
const setIntegrations = vi.fn();
|
||||
|
||||
const { result } = renderHook(() =>
|
||||
useIntegrationHandlers({ integrations: integrationState.integrations, setIntegrations })
|
||||
);
|
||||
|
||||
await act(async () => {
|
||||
await result.current.handleUpdateIntegrationConfig(integration.id, {
|
||||
email_config: { provider_type: 'api', api_key: 'new-key' },
|
||||
});
|
||||
});
|
||||
|
||||
expect(secureSecretsState.saveSecrets).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
id: integration.id,
|
||||
email_config: { provider_type: 'api', api_key: 'new-key' },
|
||||
})
|
||||
);
|
||||
});
|
||||
|
||||
it('marks integration as error when testing without required fields', async () => {
|
||||
const integration = createIntegration({
|
||||
id: 'integration-4',
|
||||
type: 'email',
|
||||
email_config: { provider_type: 'api' },
|
||||
});
|
||||
integrationState.integrations = [integration];
|
||||
const setIntegrations = vi.fn();
|
||||
const setTesting = vi.fn();
|
||||
|
||||
const { result } = renderHook(() =>
|
||||
useIntegrationHandlers({ integrations: integrationState.integrations, setIntegrations })
|
||||
);
|
||||
|
||||
await act(async () => {
|
||||
await result.current.handleTestIntegration(integration, setTesting);
|
||||
});
|
||||
|
||||
expect(preferences.updateIntegration).toHaveBeenCalledWith(integration.id, {
|
||||
status: 'error',
|
||||
error_message: 'Missing required fields',
|
||||
});
|
||||
expect(setTesting).toHaveBeenCalledWith(integration.id);
|
||||
expect(setTesting).toHaveBeenCalledWith(null);
|
||||
});
|
||||
|
||||
it('tests OIDC integrations through the API when configured', async () => {
|
||||
const integration = createIntegration({
|
||||
id: 'integration-5',
|
||||
type: 'oidc',
|
||||
name: 'Auth0',
|
||||
integration_id: 'oidc-1',
|
||||
oidc_config: {
|
||||
preset: 'auth0',
|
||||
issuer_url: 'https://auth0.test',
|
||||
client_id: 'client-1',
|
||||
scopes: ['openid', 'profile'],
|
||||
claim_mapping: DEFAULT_OIDC_CLAIM_MAPPING,
|
||||
require_email_verified: false,
|
||||
allowed_groups: [],
|
||||
},
|
||||
});
|
||||
integrationState.integrations = [integration];
|
||||
const setIntegrations = vi.fn();
|
||||
const setTesting = vi.fn();
|
||||
|
||||
apiState.testOidcConnection.mockResolvedValue({
|
||||
results: { 'oidc-1': '' },
|
||||
success_count: 1,
|
||||
failure_count: 0,
|
||||
});
|
||||
|
||||
const { result } = renderHook(() =>
|
||||
useIntegrationHandlers({ integrations: integrationState.integrations, setIntegrations })
|
||||
);
|
||||
|
||||
await act(async () => {
|
||||
await result.current.handleTestIntegration(integration, setTesting);
|
||||
});
|
||||
|
||||
expect(apiState.testOidcConnection).toHaveBeenCalledWith('oidc-1');
|
||||
expect(preferences.updateIntegration).toHaveBeenCalledWith(
|
||||
integration.id,
|
||||
expect.objectContaining({
|
||||
status: 'connected',
|
||||
})
|
||||
);
|
||||
expect(toast).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
title: 'Connection test passed',
|
||||
})
|
||||
);
|
||||
});
|
||||
|
||||
it('short-circuits test-all when no integrations are configured', async () => {
|
||||
const integration = createIntegration({
|
||||
id: 'integration-6',
|
||||
type: 'calendar',
|
||||
name: 'Google Calendar',
|
||||
oauth_config: undefined,
|
||||
});
|
||||
integrationState.integrations = [integration];
|
||||
const setIntegrations = vi.fn();
|
||||
const setTestingAll = vi.fn();
|
||||
|
||||
const { result } = renderHook(() =>
|
||||
useIntegrationHandlers({ integrations: integrationState.integrations, setIntegrations })
|
||||
);
|
||||
|
||||
await act(async () => {
|
||||
await result.current.handleTestAllIntegrations(setTestingAll);
|
||||
});
|
||||
|
||||
expect(setTestingAll).toHaveBeenCalledWith(true);
|
||||
expect(setTestingAll).toHaveBeenCalledWith(false);
|
||||
expect(toast).toHaveBeenCalledWith(
|
||||
expect.objectContaining({
|
||||
title: 'No configured integrations',
|
||||
})
|
||||
);
|
||||
});
|
||||
});
|
||||
@@ -6,13 +6,7 @@
|
||||
*/
|
||||
|
||||
import { useCallback, useEffect, useState } from 'react';
|
||||
import {
|
||||
CheckCircle2,
|
||||
Loader2,
|
||||
RefreshCw,
|
||||
Server,
|
||||
XCircle,
|
||||
} from 'lucide-react';
|
||||
import { CheckCircle2, Loader2, RefreshCw, Server, XCircle } from 'lucide-react';
|
||||
import { Badge } from '@/components/ui/badge';
|
||||
import { Button } from '@/components/ui/button';
|
||||
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card';
|
||||
|
||||
@@ -189,9 +189,7 @@ export function ProviderConfigCard({
|
||||
onChange={(e) => onModelChange(e.target.value)}
|
||||
placeholder="Enter model name"
|
||||
/>
|
||||
{manualModelHint && (
|
||||
<p className="text-xs text-muted-foreground">{manualModelHint}</p>
|
||||
)}
|
||||
{manualModelHint && <p className="text-xs text-muted-foreground">{manualModelHint}</p>}
|
||||
</div>
|
||||
)}
|
||||
|
||||
@@ -244,9 +242,7 @@ export function ProviderConfigCard({
|
||||
<Button
|
||||
variant="outline"
|
||||
onClick={onTestEndpoint}
|
||||
disabled={
|
||||
isTestingEndpoint || !config.base_url || (requiresApiKey && !config.api_key)
|
||||
}
|
||||
disabled={isTestingEndpoint || !config.base_url || (requiresApiKey && !config.api_key)}
|
||||
>
|
||||
{isTestingEndpoint ? (
|
||||
<Loader2 className={iconWithMargin.mdSpin} />
|
||||
|
||||
@@ -62,9 +62,7 @@ export function RecordingAppPolicySection({
|
||||
}
|
||||
return installedApps.filter((app) => {
|
||||
const subtext = getAppSubtext(app) ?? '';
|
||||
return (
|
||||
app.name.toLowerCase().includes(query) || subtext.toLowerCase().includes(query)
|
||||
);
|
||||
return app.name.toLowerCase().includes(query) || subtext.toLowerCase().includes(query);
|
||||
});
|
||||
}, [installedApps, search]);
|
||||
|
||||
@@ -83,9 +81,7 @@ export function RecordingAppPolicySection({
|
||||
<div className="flex items-center gap-2">
|
||||
{icon}
|
||||
<MediumLabel>{title}</MediumLabel>
|
||||
<Badge variant="secondary">
|
||||
{items.length}
|
||||
</Badge>
|
||||
<Badge variant="secondary">{items.length}</Badge>
|
||||
</div>
|
||||
{items.length === 0 ? (
|
||||
<p className="text-sm text-muted-foreground">{emptyText}</p>
|
||||
@@ -100,11 +96,7 @@ export function RecordingAppPolicySection({
|
||||
<p className="text-sm font-medium">{rule.label}</p>
|
||||
<p className="text-xs text-muted-foreground">{rule.id}</p>
|
||||
</div>
|
||||
<Button
|
||||
size="sm"
|
||||
variant={badgeVariant}
|
||||
onClick={() => onRemove(rule.id)}
|
||||
>
|
||||
<Button size="sm" variant={badgeVariant} onClick={() => onRemove(rule.id)}>
|
||||
Remove
|
||||
</Button>
|
||||
</div>
|
||||
|
||||
@@ -244,13 +244,21 @@ export function SummarizationTemplatesList({
|
||||
Versions
|
||||
</Button>
|
||||
{canManageTemplates && !template.is_system && !template.is_archived && (
|
||||
<Button size="sm" variant="outline" onClick={() => void beginEditTemplate(template)}>
|
||||
<Button
|
||||
size="sm"
|
||||
variant="outline"
|
||||
onClick={() => void beginEditTemplate(template)}
|
||||
>
|
||||
<Pencil className="h-3.5 w-3.5 mr-1" />
|
||||
Edit
|
||||
</Button>
|
||||
)}
|
||||
{canManageTemplates && !template.is_system && !template.is_archived && (
|
||||
<Button size="sm" variant="ghost" onClick={() => void archiveTemplate(template)}>
|
||||
<Button
|
||||
size="sm"
|
||||
variant="ghost"
|
||||
onClick={() => void archiveTemplate(template)}
|
||||
>
|
||||
<Archive className="h-3.5 w-3.5 mr-1" />
|
||||
Archive
|
||||
</Button>
|
||||
@@ -263,7 +271,10 @@ export function SummarizationTemplatesList({
|
||||
<div className="grid gap-3 sm:grid-cols-2">
|
||||
<div className="space-y-1">
|
||||
<Label>Name</Label>
|
||||
<Input value={editingName} onChange={(event) => setEditingName(event.target.value)} />
|
||||
<Input
|
||||
value={editingName}
|
||||
onChange={(event) => setEditingName(event.target.value)}
|
||||
/>
|
||||
</div>
|
||||
<div className="space-y-1">
|
||||
<Label>Description</Label>
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
import { useRef } from 'react';
|
||||
import { useNavigate } from 'react-router-dom';
|
||||
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import { TauriEvents } from '@/api/tauri-adapter';
|
||||
import { ToastAction } from '@/components/ui/toast';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
|
||||
@@ -8,6 +8,7 @@ import { ScrollArea } from '@/components/ui/scroll-area';
import { Switch } from '@/components/ui/switch';
import { Textarea } from '@/components/ui/textarea';
import { formatElapsedTime } from '@/lib/format';
import { generateUuid } from '@/lib/id-utils';
import { cn } from '@/lib/utils';

export interface NoteEdit {
@@ -70,7 +71,7 @@ export function TimestampedNotesEditor({
|
||||
|
||||
const isFirstNote = notes.length === 0;
|
||||
const newNote: NoteEdit = {
|
||||
id: crypto.randomUUID(),
|
||||
id: generateUuid(),
|
||||
timestamp: elapsedTime,
|
||||
createdAt: new Date(),
|
||||
content: trimmed,
|
||||
|
||||
@@ -8,32 +8,29 @@ import * as React from 'react';
import { cva, type VariantProps } from '@/lib/cva';
import { cn } from '@/lib/utils';

const iconCircleVariants = cva(
'inline-flex items-center justify-center rounded-full shrink-0',
{
variants: {
variant: {
default: 'bg-muted text-muted-foreground',
primary: 'bg-primary/10 text-primary',
success: 'bg-success/10 text-success',
warning: 'bg-warning/10 text-warning',
destructive: 'bg-destructive/10 text-destructive',
info: 'bg-blue-500/10 text-blue-500',
outline: 'border border-border bg-transparent',
},
size: {
sm: 'h-6 w-6 [&_svg]:h-3 [&_svg]:w-3',
default: 'h-8 w-8 [&_svg]:h-4 [&_svg]:w-4',
lg: 'h-10 w-10 [&_svg]:h-5 [&_svg]:w-5',
xl: 'h-12 w-12 [&_svg]:h-6 [&_svg]:w-6',
},
const iconCircleVariants = cva('inline-flex items-center justify-center rounded-full shrink-0', {
variants: {
variant: {
default: 'bg-muted text-muted-foreground',
primary: 'bg-primary/10 text-primary',
success: 'bg-success/10 text-success',
warning: 'bg-warning/10 text-warning',
destructive: 'bg-destructive/10 text-destructive',
info: 'bg-blue-500/10 text-blue-500',
outline: 'border border-border bg-transparent',
},
defaultVariants: {
variant: 'default',
size: 'default',
size: {
sm: 'h-6 w-6 [&_svg]:h-3 [&_svg]:w-3',
default: 'h-8 w-8 [&_svg]:h-4 [&_svg]:w-4',
lg: 'h-10 w-10 [&_svg]:h-5 [&_svg]:w-5',
xl: 'h-12 w-12 [&_svg]:h-6 [&_svg]:w-6',
},
}
);
},
defaultVariants: {
variant: 'default',
size: 'default',
},
});

export interface IconCircleProps
extends React.HTMLAttributes<HTMLDivElement>,
@@ -65,11 +62,7 @@ export interface IconCircleProps
const IconCircle = React.forwardRef<HTMLDivElement, IconCircleProps>(
({ className, variant, size, children, ...props }, ref) => {
return (
<div
ref={ref}
className={cn(iconCircleVariants({ variant, size, className }))}
{...props}
>
<div ref={ref} className={cn(iconCircleVariants({ variant, size, className }))} {...props}>
{children}
</div>
);

@@ -3,16 +3,23 @@ import * as React from 'react';

import { cn } from '@/lib/utils';

type ScrollAreaProps = React.ComponentPropsWithoutRef<typeof ScrollAreaPrimitive.Root> & {
viewportRef?: React.Ref<HTMLDivElement>;
};

const ScrollArea = React.forwardRef<
React.ElementRef<typeof ScrollAreaPrimitive.Root>,
React.ComponentPropsWithoutRef<typeof ScrollAreaPrimitive.Root>
>(({ className, children, ...props }, ref) => (
ScrollAreaProps
>(({ className, children, viewportRef, ...props }, ref) => (
<ScrollAreaPrimitive.Root
ref={ref}
className={cn('relative overflow-hidden', className)}
{...props}
>
<ScrollAreaPrimitive.Viewport className="h-full w-full rounded-[inherit]">
<ScrollAreaPrimitive.Viewport
ref={viewportRef}
className="h-full w-full rounded-[inherit]"
>
{children}
</ScrollAreaPrimitive.Viewport>
<ScrollBar />

@@ -1,16 +1,7 @@
// Webhook management settings panel component

import { formatDistanceToNow } from 'date-fns';
import {
AlertCircle,
History,
Loader2,
Plus,
Settings2,
Trash2,
Webhook,
X,
} from 'lucide-react';
import { AlertCircle, History, Loader2, Plus, Settings2, Trash2, Webhook, X } from 'lucide-react';
import { useEffect, useState } from 'react';
import { Placeholders, Timing } from '@/api/constants';
import type { RegisteredWebhook, WebhookDelivery, WebhookEventType } from '@/api/types';

@@ -2,7 +2,12 @@
// (Sprint GAP-007: Simulation Mode Clarity - expose mode and simulation state)

import { useEffect, useMemo, useState } from 'react';
import { getConnectionState, setConnectionMode, setConnectionServerUrl, subscribeConnectionState } from '@/api/connection-state';
import {
getConnectionState,
setConnectionMode,
setConnectionServerUrl,
subscribeConnectionState,
} from '@/api/connection-state';
import type { ConnectionState } from '@/api/connection-state';
import { TauriEvents } from '@/api/tauri-adapter';
import { ConnectionContext, type ConnectionHelpers } from '@/contexts/connection-state';

@@ -161,9 +161,7 @@ export function ProjectProvider({ children }: { children: React.ReactNode }) {
const updated = await getAPI().archiveProject(projectId);
// Use functional state update to avoid stale closure
setProjects((prev) => {
const nextProjects = prev.map((project) =>
project.id === updated.id ? updated : project
);
const nextProjects = prev.map((project) => (project.id === updated.id ? updated : project));
// Handle active project switch inside the updater to use fresh state
if (activeProjectId === projectId && currentWorkspace) {
const nextActive = resolveActiveProject(nextProjects, null);

@@ -106,7 +106,8 @@ export function WorkspaceProvider({ children }: { children: React.ReactNode }) {
const response = await api.switchWorkspace(workspaceId);
// Use ref to get current workspaces without stale closure
const selected =
response.workspace ?? workspacesRef.current.find((workspace) => workspace.id === workspaceId);
response.workspace ??
workspacesRef.current.find((workspace) => workspace.id === workspaceId);
if (!response.success || !selected) {
throw new Error('Workspace not found');
}

@@ -137,9 +137,7 @@ export function shouldAutoStartProcessing(
// If processing is already complete or in progress, don't restart
const { summary, entities, diarization } = processingStatus;
const anyFailed =
summary.status === 'failed' ||
entities.status === 'failed' ||
diarization.status === 'failed';
summary.status === 'failed' || entities.status === 'failed' || diarization.status === 'failed';
const allTerminal =
['completed', 'failed', 'skipped'].includes(summary.status) &&
['completed', 'failed', 'skipped'].includes(entities.status) &&

@@ -17,6 +17,8 @@ import { useConnectionState } from '@/contexts/connection-state';

/** Polling interval for job status (ms) */
const JOB_POLL_INTERVAL = 500;
/** Maximum polling duration before timeout (5 minutes) */
const MAX_POLL_DURATION_MS = 5 * 60 * 1000;

/** Load state for async operations. */
type LoadState = 'idle' | 'loading' | 'ready' | 'failed';
@@ -59,15 +61,23 @@ interface UseAsrConfigReturn {

export function useAsrConfig(): UseAsrConfigReturn {
const [state, setState] = useState<AsrConfigState>(initialState);
const pollingRef = useRef<number | null>(null);
const { mode } = useConnectionState();
const lastModeRef = useRef(mode);

// Polling state with generation token for proper cancellation
const pollTimeoutRef = useRef<ReturnType<typeof setTimeout> | null>(null);
const pollGenerationRef = useRef(0);
const pollStartTimeRef = useRef<number | null>(null);
const isMountedRef = useRef(true);

const cancelPolling = useCallback(() => {
if (pollingRef.current !== null) {
window.clearInterval(pollingRef.current);
pollingRef.current = null;
// Increment generation to invalidate any in-flight polls
pollGenerationRef.current++;
if (pollTimeoutRef.current !== null) {
clearTimeout(pollTimeoutRef.current);
pollTimeoutRef.current = null;
}
pollStartTimeRef.current = null;
}, []);

// Load initial configuration
@@ -86,6 +96,14 @@ export function useAsrConfig(): UseAsrConfigReturn {
}
}, []);

// Mount/unmount tracking
useEffect(() => {
isMountedRef.current = true;
return () => {
isMountedRef.current = false;
};
}, []);

useEffect(() => {
void refresh();
return cancelPolling;
@@ -101,12 +119,37 @@ export function useAsrConfig(): UseAsrConfigReturn {
}
}, [mode, refresh]);

// Poll for job status updates
// Poll for job status updates with generation token for proper cancellation
const pollJobStatus = useCallback(
async (jobId: string) => {
async (jobId: string, generation: number) => {
// Check if this poll has been cancelled (generation changed)
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
return;
}

// Check max poll duration
if (pollStartTimeRef.current !== null) {
const elapsed = Date.now() - pollStartTimeRef.current;
if (elapsed > MAX_POLL_DURATION_MS) {
cancelPolling();
setState((prev) => ({
...prev,
isReconfiguring: false,
errorMessage: 'Reconfiguration timed out',
}));
return;
}
}

try {
const api = getAPI();
const status = await api.getAsrJobStatus(jobId);

// Re-check generation after async work
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
return;
}

setState((prev) => ({ ...prev, jobStatus: status }));

if (status.status === 'completed') {
@@ -136,8 +179,18 @@ export function useAsrConfig(): UseAsrConfigReturn {
isReconfiguring: false,
errorMessage: status.errorMessage || 'Reconfiguration cancelled',
}));
} else {
// Continue polling with setTimeout (not setInterval) for sequential execution
pollTimeoutRef.current = setTimeout(
() => void pollJobStatus(jobId, generation),
JOB_POLL_INTERVAL
);
}
} catch (err) {
// Re-check generation after async work
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
return;
}
cancelPolling();
setState((prev) => ({
...prev,
@@ -151,6 +204,8 @@ export function useAsrConfig(): UseAsrConfigReturn {

const updateConfig = useCallback(
async (request: UpdateASRConfigurationRequest): Promise<boolean> => {
// Cancel any existing polling before starting new config update
cancelPolling();
setState((prev) => ({ ...prev, errorMessage: null, isReconfiguring: true }));

try {
@@ -166,7 +221,10 @@ export function useAsrConfig(): UseAsrConfigReturn {
return false;
}

// Start polling for job status
// Start polling for job status with new generation token
const generation = ++pollGenerationRef.current;
pollStartTimeRef.current = Date.now();

setState((prev) => ({
...prev,
jobStatus: {
@@ -179,9 +237,11 @@ export function useAsrConfig(): UseAsrConfigReturn {
},
}));

pollingRef.current = window.setInterval(() => {
void pollJobStatus(result.jobId);
}, JOB_POLL_INTERVAL);
// Use setTimeout for first poll (sequential polling, not overlapping setInterval)
pollTimeoutRef.current = setTimeout(
() => void pollJobStatus(result.jobId, generation),
JOB_POLL_INTERVAL
);

return true;
} catch (err) {
@@ -193,7 +253,7 @@ export function useAsrConfig(): UseAsrConfigReturn {
return false;
}
},
[pollJobStatus]
[cancelPolling, pollJobStatus]
);

return {

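Reviewer note: the generation-token approach in the `useAsrConfig` diff above is the key change. Every `cancelPolling()` bumps a counter, and each in-flight poll re-checks that counter after every `await`, so a response that arrives after cancellation can never write state. A minimal sketch of the pattern follows; `getStatus` and the interval are illustrative stand-ins, not the app's real API.

```typescript
// Minimal sketch of generation-token polling cancellation (illustrative only).
type JobStatus = 'running' | 'completed';

// Stand-in for the real status call (assumption for this sketch).
async function getStatus(_jobId: string): Promise<JobStatus> {
  return 'completed';
}

let generation = 0;

function cancelPolling(): void {
  generation++; // any poll started under an older generation now bails out
}

async function poll(jobId: string, myGeneration: number): Promise<void> {
  if (myGeneration !== generation) return; // cancelled before this tick ran
  const status = await getStatus(jobId);
  if (myGeneration !== generation) return; // cancelled while awaiting
  if (status !== 'completed') {
    setTimeout(() => void poll(jobId, myGeneration), 500); // sequential, never overlapping
  }
}

function startPolling(jobId: string): void {
  const myGeneration = ++generation; // fresh token for this polling run
  setTimeout(() => void poll(jobId, myGeneration), 500);
}
```

Using `setTimeout` per tick instead of `setInterval` also guarantees that a slow status request never overlaps the next one, which the diff calls out explicitly.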
@@ -27,7 +27,7 @@ export async function resolveStoredDevice(
|
||||
kind: AudioDeviceMatchKind,
|
||||
setSelected: (id: string) => void
|
||||
): Promise<DeviceResolutionResult> {
|
||||
const hadStoredSelection = Boolean(storedId);
|
||||
const hadStoredSelection = Boolean(storedId || storedName);
|
||||
let resolvedId = storedId;
|
||||
|
||||
// If stored ID matches a current device, use it directly
|
||||
@@ -46,8 +46,8 @@ export async function resolveStoredDevice(
|
||||
return { resolvedId, hadStoredSelection };
|
||||
}
|
||||
|
||||
// Try to resolve the stored ID to a current device
|
||||
if (storedId) {
|
||||
// Try to resolve the stored ID/name to a current device
|
||||
if (storedId || storedName) {
|
||||
const resolved = resolveAudioDeviceId(devices, storedId, kind, storedName);
|
||||
if (resolved) {
|
||||
resolvedId = resolved;
|
||||
@@ -59,7 +59,7 @@ export async function resolveStoredDevice(
|
||||
level: 'info',
|
||||
source: 'app',
|
||||
message: `${kind} device resolved for session (original preserved)`,
|
||||
metadata: { stored_id: storedId, resolved_id: resolved },
|
||||
metadata: { stored_id: storedId, stored_name: storedName, resolved_id: resolved },
|
||||
});
|
||||
} catch (error) {
|
||||
addClientLog({
|
||||
@@ -75,7 +75,7 @@ export async function resolveStoredDevice(
|
||||
level: 'warning',
|
||||
source: 'app',
|
||||
message: `Stored ${kind} device not available`,
|
||||
metadata: { device_id: storedId },
|
||||
metadata: { device_id: storedId, device_name: storedName },
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
import { act, renderHook } from '@testing-library/react';
|
||||
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import { TauriCommands } from '@/api/constants';
|
||||
import { isTauriEnvironment } from '@/api/tauri-adapter';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
@@ -8,7 +8,7 @@ import { preferences } from '@/lib/preferences';
|
||||
import { useTauriEvent } from '@/lib/tauri-events';
|
||||
import { useAudioDevices } from './use-audio-devices';
|
||||
|
||||
vi.mock('@/api', () => ({
|
||||
vi.mock('@/api/interface', () => ({
|
||||
getAPI: vi.fn(),
|
||||
}));
|
||||
|
||||
@@ -388,9 +388,9 @@ describe('useAudioDevices', () => {
|
||||
|
||||
const selectAudioDevice = vi.fn();
|
||||
vi.mocked(getAPI).mockReturnValue({
|
||||
getPreferences: vi.fn().mockImplementation(() =>
|
||||
Promise.resolve({ audio_devices: prefsState.audio_devices })
|
||||
),
|
||||
getPreferences: vi
|
||||
.fn()
|
||||
.mockImplementation(() => Promise.resolve({ audio_devices: prefsState.audio_devices })),
|
||||
listAudioDevices: vi.fn().mockResolvedValue([
|
||||
{ id: 'input:Mic', name: 'Mic', is_input: true },
|
||||
{ id: 'output:Speakers', name: 'Speakers', is_input: false },
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import { clientLog } from '@/lib/client-log-events';
|
||||
import { addClientLog } from '@/lib/client-logs';
|
||||
import { isTauriEnvironment } from '@/api/tauri-adapter';
|
||||
@@ -18,7 +18,12 @@ import type {
|
||||
|
||||
export type { AudioDevice, UseAudioDevicesOptions, UseAudioDevicesReturn };
|
||||
|
||||
addClientLog({ level: 'debug', source: 'app', message: 'use-audio-devices.ts MODULE LOADED', metadata: { isTauri: isTauriEnvironment() } });
|
||||
addClientLog({
|
||||
level: 'debug',
|
||||
source: 'app',
|
||||
message: 'use-audio-devices.ts MODULE LOADED',
|
||||
metadata: { isTauri: isTauriEnvironment() },
|
||||
});
|
||||
|
||||
const log = debug('useAudioDevices');
|
||||
|
||||
@@ -101,21 +106,21 @@ export function useAudioDevices(options: UseAudioDevicesOptions = {}): UseAudioD
|
||||
|
||||
useEffect(() => {
|
||||
return preferences.subscribe((prefs) => {
|
||||
const newInputId = prefs.audio_devices.input_device_id;
|
||||
const newOutputId = prefs.audio_devices.output_device_id;
|
||||
const newSystemId = prefs.audio_devices.system_device_id ?? '';
|
||||
const newDualCapture = prefs.audio_devices.dual_capture_enabled ?? false;
|
||||
const newMicGain = prefs.audio_devices.mic_gain ?? 1.0;
|
||||
const newSystemGain = prefs.audio_devices.system_gain ?? 1.0;
|
||||
setSelectedInputDevice((current) => (current !== newInputId ? newInputId : current));
|
||||
setSelectedOutputDevice((current) => (current !== newOutputId ? newOutputId : current));
|
||||
setSelectedSystemDevice((current) => (current !== newSystemId ? newSystemId : current));
|
||||
setDualCaptureEnabledState((current) =>
|
||||
current !== newDualCapture ? newDualCapture : current
|
||||
);
|
||||
setMicGain((current) => (current !== newMicGain ? newMicGain : current));
|
||||
setSystemGain((current) => (current !== newSystemGain ? newSystemGain : current));
|
||||
});
|
||||
const newInputId = prefs.audio_devices.input_device_id;
|
||||
const newOutputId = prefs.audio_devices.output_device_id;
|
||||
const newSystemId = prefs.audio_devices.system_device_id ?? '';
|
||||
const newDualCapture = prefs.audio_devices.dual_capture_enabled ?? false;
|
||||
const newMicGain = prefs.audio_devices.mic_gain ?? 1.0;
|
||||
const newSystemGain = prefs.audio_devices.system_gain ?? 1.0;
|
||||
setSelectedInputDevice((current) => (current !== newInputId ? newInputId : current));
|
||||
setSelectedOutputDevice((current) => (current !== newOutputId ? newOutputId : current));
|
||||
setSelectedSystemDevice((current) => (current !== newSystemId ? newSystemId : current));
|
||||
setDualCaptureEnabledState((current) =>
|
||||
current !== newDualCapture ? newDualCapture : current
|
||||
);
|
||||
setMicGain((current) => (current !== newMicGain ? newMicGain : current));
|
||||
setSystemGain((current) => (current !== newSystemGain ? newSystemGain : current));
|
||||
});
|
||||
}, []);
|
||||
|
||||
const selectedInputDeviceRef = useRef(selectedInputDevice);
|
||||
@@ -129,7 +134,12 @@ export function useAudioDevices(options: UseAudioDevicesOptions = {}): UseAudioD
|
||||
|
||||
const loadDevices = useCallback(async () => {
|
||||
setIsLoading(true);
|
||||
addClientLog({ level: 'debug', source: 'app', message: 'loadDevices: ENTRY', metadata: { isTauri: isTauriEnvironment() } });
|
||||
addClientLog({
|
||||
level: 'debug',
|
||||
source: 'app',
|
||||
message: 'loadDevices: ENTRY',
|
||||
metadata: { isTauri: isTauriEnvironment() },
|
||||
});
|
||||
|
||||
try {
|
||||
if (isTauriEnvironment()) {
|
||||
@@ -224,10 +234,20 @@ export function useAudioDevices(options: UseAudioDevicesOptions = {}): UseAudioD
|
||||
}
|
||||
|
||||
const inputResult = await resolveStoredDevice(
|
||||
api, inputs, storedInputId, storedInputName, 'input', setSelectedInputDevice
|
||||
api,
|
||||
inputs,
|
||||
storedInputId,
|
||||
storedInputName,
|
||||
'input',
|
||||
setSelectedInputDevice
|
||||
);
|
||||
const outputResult = await resolveStoredDevice(
|
||||
api, outputs, storedOutputId, storedOutputName, 'output', setSelectedOutputDevice
|
||||
api,
|
||||
outputs,
|
||||
storedOutputId,
|
||||
storedOutputName,
|
||||
'output',
|
||||
setSelectedOutputDevice
|
||||
);
|
||||
|
||||
if (inputs.length > 0 && !inputResult.resolvedId && !inputResult.hadStoredSelection) {
|
||||
@@ -434,7 +454,6 @@ export function useAudioDevices(options: UseAudioDevicesOptions = {}): UseAudioD
|
||||
},
|
||||
[showToasts]
|
||||
);
|
||||
|
||||
useEffect(() => {
|
||||
if (!autoLoad || !isHydratedState) {
|
||||
return;
|
||||
@@ -445,7 +464,6 @@ export function useAudioDevices(options: UseAudioDevicesOptions = {}): UseAudioD
|
||||
autoLoadRef.current = true;
|
||||
void loadDevices();
|
||||
}, [autoLoad, isHydratedState, loadDevices]);
|
||||
|
||||
useEffect(() => {
|
||||
return () => {
|
||||
void stopInputTest();
|
||||
|
||||
@@ -5,7 +5,7 @@
|
||||
* Extracted from use-audio-devices.ts to keep files under 500 lines.
|
||||
*/
|
||||
|
||||
import { useCallback, useRef, useState } from 'react';
|
||||
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||
import { TauriCommands, Timing } from '@/api/constants';
|
||||
import { isTauriEnvironment, TauriEvents } from '@/api/tauri-adapter';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
@@ -55,6 +55,9 @@ export function useAudioTesting({
|
||||
const analyserRef = useRef<AnalyserNode | null>(null);
|
||||
const mediaStreamRef = useRef<MediaStream | null>(null);
|
||||
const animationFrameRef = useRef<number | null>(null);
|
||||
// Ref for output test timeout to ensure cleanup on unmount
|
||||
const outputTestTimeoutRef = useRef<ReturnType<typeof setTimeout> | null>(null);
|
||||
const isMountedRef = useRef(true);
|
||||
|
||||
/**
|
||||
* Start testing the selected input device (microphone level visualization)
|
||||
@@ -162,6 +165,12 @@ export function useAudioTesting({
|
||||
* Test the output device by playing a tone
|
||||
*/
|
||||
const testOutputDevice = useCallback(async () => {
|
||||
// Clear any existing output test timeout
|
||||
if (outputTestTimeoutRef.current) {
|
||||
clearTimeout(outputTestTimeoutRef.current);
|
||||
outputTestTimeoutRef.current = null;
|
||||
}
|
||||
|
||||
if (isTauriEnvironment()) {
|
||||
try {
|
||||
const core: typeof import('@tauri-apps/api/core') = await import('@tauri-apps/api/core');
|
||||
@@ -172,8 +181,13 @@ export function useAudioTesting({
|
||||
if (showToasts) {
|
||||
toast({ title: 'Output test', description: 'Playing test tone' });
|
||||
}
|
||||
// Output test auto-stops after 2 seconds
|
||||
setTimeout(() => setIsTestingOutput(false), Timing.TWO_SECONDS_MS);
|
||||
// Output test auto-stops after 2 seconds - track timeout for cleanup
|
||||
outputTestTimeoutRef.current = setTimeout(() => {
|
||||
if (isMountedRef.current) {
|
||||
setIsTestingOutput(false);
|
||||
}
|
||||
outputTestTimeoutRef.current = null;
|
||||
}, Timing.TWO_SECONDS_MS);
|
||||
} catch (err) {
|
||||
if (showToasts) {
|
||||
toastError({
|
||||
@@ -208,9 +222,13 @@ export function useAudioTesting({
|
||||
toast({ title: 'Output test', description: 'Playing test tone' });
|
||||
}
|
||||
|
||||
setTimeout(() => {
|
||||
setIsTestingOutput(false);
|
||||
// Track timeout for cleanup on unmount
|
||||
outputTestTimeoutRef.current = setTimeout(() => {
|
||||
if (isMountedRef.current) {
|
||||
setIsTestingOutput(false);
|
||||
}
|
||||
audioContext.close();
|
||||
outputTestTimeoutRef.current = null;
|
||||
}, 500);
|
||||
} catch {
|
||||
if (showToasts) {
|
||||
@@ -238,6 +256,18 @@ export function useAudioTesting({
|
||||
[isTestingInput]
|
||||
);
|
||||
|
||||
// Cleanup on unmount - track mount state and clear output timeout
|
||||
useEffect(() => {
|
||||
isMountedRef.current = true;
|
||||
return () => {
|
||||
isMountedRef.current = false;
|
||||
if (outputTestTimeoutRef.current) {
|
||||
clearTimeout(outputTestTimeoutRef.current);
|
||||
outputTestTimeoutRef.current = null;
|
||||
}
|
||||
};
|
||||
}, []);
|
||||
|
||||
return {
|
||||
isTestingInput,
|
||||
isTestingOutput,
|
||||
|
||||
@@ -30,6 +30,7 @@ interface UseCalendarSyncReturn {
|
||||
hoursAhead?: number;
|
||||
limit?: number;
|
||||
provider?: string;
|
||||
background?: boolean;
|
||||
}) => Promise<void>;
|
||||
fetchProviders: () => Promise<void>;
|
||||
refresh: () => Promise<void>;
|
||||
@@ -56,7 +57,9 @@ export function useCalendarSync(options: UseCalendarSyncOptions = {}): UseCalend
|
||||
|
||||
const [state, setState] = useState<CalendarSyncState>(initialState);
|
||||
const [isAutoRefreshing, setIsAutoRefreshing] = useState(false);
|
||||
const intervalRef = useRef<NodeJS.Timeout | null>(null);
|
||||
const intervalRef = useRef<ReturnType<typeof setInterval> | null>(null);
|
||||
// In-flight guard to prevent overlapping refreshes
|
||||
const inFlightRef = useRef(false);
|
||||
const optionsRef = useRef({
|
||||
hoursAhead: defaultHoursAhead,
|
||||
limit: defaultLimit,
|
||||
@@ -87,18 +90,31 @@ export function useCalendarSync(options: UseCalendarSyncOptions = {}): UseCalend
|
||||
}, []);
|
||||
|
||||
const fetchEvents = useCallback(
|
||||
async (fetchOptions?: { hoursAhead?: number; limit?: number; provider?: string }) => {
|
||||
async (fetchOptions?: {
|
||||
hoursAhead?: number;
|
||||
limit?: number;
|
||||
provider?: string;
|
||||
background?: boolean;
|
||||
}) => {
|
||||
const {
|
||||
hoursAhead = optionsRef.current.hoursAhead,
|
||||
limit = optionsRef.current.limit,
|
||||
provider = optionsRef.current.provider,
|
||||
background = false,
|
||||
} = fetchOptions || {};
|
||||
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
status: 'loading',
|
||||
error: null,
|
||||
}));
|
||||
if (inFlightRef.current) {
|
||||
return;
|
||||
}
|
||||
inFlightRef.current = true;
|
||||
|
||||
if (!background) {
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
status: 'loading',
|
||||
error: null,
|
||||
}));
|
||||
}
|
||||
|
||||
try {
|
||||
const api = getAPI();
|
||||
@@ -109,6 +125,7 @@ export function useCalendarSync(options: UseCalendarSyncOptions = {}): UseCalend
|
||||
status: 'success',
|
||||
events: response.events,
|
||||
lastSync: Date.now(),
|
||||
error: null,
|
||||
}));
|
||||
} catch (error) {
|
||||
// Check if this is a stale integration (not found on server)
|
||||
@@ -146,6 +163,8 @@ export function useCalendarSync(options: UseCalendarSyncOptions = {}): UseCalend
|
||||
fallback: 'Failed to fetch calendar events',
|
||||
});
|
||||
}
|
||||
} finally {
|
||||
inFlightRef.current = false;
|
||||
}
|
||||
},
|
||||
[isAutoRefreshing]
|
||||
@@ -169,7 +188,7 @@ export function useCalendarSync(options: UseCalendarSyncOptions = {}): UseCalend
|
||||
setIsAutoRefreshing(true);
|
||||
intervalRef.current = setInterval(() => {
|
||||
// Use ref to get current fetchEvents without stale closure
|
||||
fetchEventsRef.current();
|
||||
fetchEventsRef.current({ background: true });
|
||||
}, autoRefreshInterval);
|
||||
}, [autoRefreshInterval]);
|
||||
|
||||
|
||||
@@ -12,13 +12,13 @@
|
||||
|
||||
import { act, renderHook } from '@testing-library/react';
|
||||
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
|
||||
import * as api from '@/api';
|
||||
import * as api from '@/api/interface';
|
||||
import type { DiarizationJobStatus, JobStatus } from '@/api/types';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
import { useDiarization } from './use-diarization';
|
||||
|
||||
// Mock the API module
|
||||
vi.mock('@/api', () => ({
|
||||
vi.mock('@/api/interface', () => ({
|
||||
getAPI: vi.fn(),
|
||||
}));
|
||||
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
*/
|
||||
|
||||
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { DiarizationJobStatus, JobStatus } from '@/api/types';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
import { PollingConfig } from '@/lib/config';
|
||||
@@ -109,9 +109,13 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
const isMountedRef = useRef(true);
|
||||
/** Track poll start time for max duration check */
|
||||
const pollStartTimeRef = useRef<number | null>(null);
|
||||
/** Generation token to invalidate in-flight polls on stop/reset */
|
||||
const pollGenerationRef = useRef(0);
|
||||
|
||||
/** Stop polling */
|
||||
const stopPolling = useCallback(() => {
|
||||
// Increment generation to invalidate any in-flight polls
|
||||
pollGenerationRef.current++;
|
||||
if (pollTimeoutRef.current) {
|
||||
clearTimeout(pollTimeoutRef.current);
|
||||
pollTimeoutRef.current = null;
|
||||
@@ -123,8 +127,9 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
|
||||
/** Poll for job status with backoff */
|
||||
const poll = useCallback(
|
||||
async (jobId: string) => {
|
||||
if (!isMountedRef.current) {
|
||||
async (jobId: string, generation: number) => {
|
||||
// Check if this poll has been invalidated (generation changed)
|
||||
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -154,7 +159,8 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
const api = getAPI();
|
||||
const status = await api.getDiarizationJobStatus(jobId);
|
||||
|
||||
if (!isMountedRef.current) {
|
||||
// Re-check generation after async work
|
||||
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -211,7 +217,7 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
|
||||
if (status.status === 'failed') {
|
||||
stopPolling();
|
||||
setState((prev) => ({ ...prev, isActive: false, error: status.error_message }));
|
||||
setState((prev) => ({ ...prev, isActive: false, error: status.error_message ?? null }));
|
||||
onError?.(status.error_message || 'Diarization failed');
|
||||
if (showToasts) {
|
||||
toast({
|
||||
@@ -235,14 +241,18 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
return;
|
||||
}
|
||||
|
||||
// Continue polling with backoff for running/queued jobs
|
||||
// Continue polling with backoff for running/queued jobs (pass generation for cancellation)
|
||||
currentPollIntervalRef.current = Math.min(
|
||||
currentPollIntervalRef.current * POLL_BACKOFF_MULTIPLIER,
|
||||
MAX_POLL_INTERVAL_MS
|
||||
);
|
||||
pollTimeoutRef.current = setTimeout(() => poll(jobId), currentPollIntervalRef.current);
|
||||
pollTimeoutRef.current = setTimeout(
|
||||
() => poll(jobId, generation),
|
||||
currentPollIntervalRef.current
|
||||
);
|
||||
} catch (error) {
|
||||
if (!isMountedRef.current) {
|
||||
// Re-check generation after async work
|
||||
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -253,7 +263,7 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
retryCountRef.current += 1;
|
||||
const retryDelay =
|
||||
INITIAL_RETRY_DELAY_MS * RETRY_BACKOFF_MULTIPLIER ** (retryCountRef.current - 1);
|
||||
pollTimeoutRef.current = setTimeout(() => poll(jobId), retryDelay);
|
||||
pollTimeoutRef.current = setTimeout(() => poll(jobId, generation), retryDelay);
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -307,12 +317,13 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
if (status.status === 'queued' || status.status === 'running') {
|
||||
// Track poll start time for max duration timeout
|
||||
pollStartTimeRef.current = Date.now();
|
||||
pollTimeoutRef.current = setTimeout(() => poll(status.job_id), pollInterval);
|
||||
const generation = pollGenerationRef.current;
|
||||
pollTimeoutRef.current = setTimeout(() => poll(status.job_id, generation), pollInterval);
|
||||
} else if (status.status === 'completed') {
|
||||
setState((prev) => ({ ...prev, isActive: false }));
|
||||
onComplete?.(status);
|
||||
} else if (status.status === 'failed') {
|
||||
setState((prev) => ({ ...prev, isActive: false, error: status.error_message }));
|
||||
setState((prev) => ({ ...prev, isActive: false, error: status.error_message ?? null }));
|
||||
onError?.(status.error_message || 'Diarization failed');
|
||||
}
|
||||
} catch (error) {
|
||||
@@ -425,7 +436,8 @@ export function useDiarization(options: UseDiarizationOptions = {}): UseDiarizat
|
||||
// Resume polling if job is still active
|
||||
if (job.status === 'queued' || job.status === 'running') {
|
||||
pollStartTimeRef.current = Date.now();
|
||||
pollTimeoutRef.current = setTimeout(() => poll(job.job_id), pollInterval);
|
||||
const generation = pollGenerationRef.current;
|
||||
pollTimeoutRef.current = setTimeout(() => poll(job.job_id, generation), pollInterval);
|
||||
|
||||
if (showToasts) {
|
||||
toast({
|
||||
|
||||
@@ -88,47 +88,44 @@ export function useHuggingFaceToken(): UseHuggingFaceTokenReturn {
|
||||
}
|
||||
}, [mode, refresh]);
|
||||
|
||||
const setToken = useCallback(
|
||||
async (token: string, validate = true): Promise<boolean> => {
|
||||
setState((prev) => ({ ...prev, isSaving: true, errorMessage: null }));
|
||||
const setToken = useCallback(async (token: string, validate = true): Promise<boolean> => {
|
||||
setState((prev) => ({ ...prev, isSaving: true, errorMessage: null }));
|
||||
|
||||
try {
|
||||
const api = getAPI();
|
||||
const result = await api.setHuggingFaceToken({ token, validate });
|
||||
try {
|
||||
const api = getAPI();
|
||||
const result = await api.setHuggingFaceToken({ token, validate });
|
||||
|
||||
if (!result.success) {
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
isSaving: false,
|
||||
errorMessage: result.validationError || 'Failed to save token',
|
||||
}));
|
||||
return false;
|
||||
}
|
||||
|
||||
// Update local status
|
||||
if (!result.success) {
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
isSaving: false,
|
||||
tokenStatus: {
|
||||
isConfigured: true,
|
||||
isValidated: validate && result.valid === true,
|
||||
username: result.username,
|
||||
validatedAt: validate && result.valid === true ? Date.now() / 1000 : null,
|
||||
},
|
||||
}));
|
||||
|
||||
return true;
|
||||
} catch (err) {
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
isSaving: false,
|
||||
errorMessage: extractErrorMessage(err, 'Failed to save HuggingFace token'),
|
||||
errorMessage: result.validationError || 'Failed to save token',
|
||||
}));
|
||||
return false;
|
||||
}
|
||||
},
|
||||
[]
|
||||
);
|
||||
|
||||
// Update local status
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
isSaving: false,
|
||||
tokenStatus: {
|
||||
isConfigured: true,
|
||||
isValidated: validate && result.valid === true,
|
||||
username: result.username,
|
||||
validatedAt: validate && result.valid === true ? Date.now() / 1000 : null,
|
||||
},
|
||||
}));
|
||||
|
||||
return true;
|
||||
} catch (err) {
|
||||
setState((prev) => ({
|
||||
...prev,
|
||||
isSaving: false,
|
||||
errorMessage: extractErrorMessage(err, 'Failed to save HuggingFace token'),
|
||||
}));
|
||||
return false;
|
||||
}
|
||||
}, []);
|
||||
|
||||
const deleteToken = useCallback(async (): Promise<boolean> => {
|
||||
setState((prev) => ({ ...prev, isSaving: true, errorMessage: null }));
|
||||
|
||||
@@ -84,7 +84,6 @@ function sendNotification(type: 'success' | 'error', integrationName: string, me
|
||||
variant: type === 'error' ? 'destructive' : 'default',
|
||||
});
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
/** Result of a sync operation with optional not-found flag for cache invalidation. */
|
||||
@@ -147,11 +146,13 @@ export function useIntegrationSync(): UseSyncSchedulerReturn {
|
||||
const [syncStates, setSyncStates] = useState<IntegrationSyncState>({});
|
||||
const [isSchedulerRunning, setIsSchedulerRunning] = useState(false);
|
||||
const [isPaused, setIsPaused] = useState(() => preferences.isSyncSchedulerPaused());
|
||||
const intervalsRef = useRef<Map<string, NodeJS.Timeout>>(new Map());
|
||||
const intervalsRef = useRef<Map<string, ReturnType<typeof setInterval>>>(new Map());
|
||||
const initialTimeoutsRef = useRef<Map<string, ReturnType<typeof setTimeout>>>(new Map());
|
||||
const integrationsRef = useRef<Integration[]>([]);
|
||||
const pausedRef = useRef(isPaused);
|
||||
const mountedRef = useRef(true);
|
||||
// In-flight guard per integration to prevent overlapping syncs
|
||||
const inFlightRef = useRef<Set<string>>(new Set());
|
||||
|
||||
useEffect(() => {
|
||||
mountedRef.current = true;
|
||||
@@ -169,6 +170,11 @@ export function useIntegrationSync(): UseSyncSchedulerReturn {
|
||||
return;
|
||||
}
|
||||
|
||||
// In-flight guard: skip if sync is already running for this integration
|
||||
if (inFlightRef.current.has(integrationId)) {
|
||||
return;
|
||||
}
|
||||
|
||||
const integration = integrationsRef.current.find((i) => i.id === integrationId);
|
||||
if (!integration) {
|
||||
return;
|
||||
@@ -180,9 +186,12 @@ export function useIntegrationSync(): UseSyncSchedulerReturn {
|
||||
return;
|
||||
}
|
||||
|
||||
// Mark as in-flight
|
||||
inFlightRef.current.add(integrationId);
|
||||
const startTime = Date.now();
|
||||
|
||||
if (!mountedRef.current) {
|
||||
inFlightRef.current.delete(integrationId);
|
||||
return;
|
||||
}
|
||||
setSyncStates((prev) => ({
|
||||
@@ -199,6 +208,7 @@ export function useIntegrationSync(): UseSyncSchedulerReturn {
|
||||
const result = await performSync(integration.integration_id);
|
||||
|
||||
if (!mountedRef.current) {
|
||||
inFlightRef.current.delete(integrationId);
|
||||
return;
|
||||
}
|
||||
const now = Date.now();
|
||||
@@ -243,6 +253,7 @@ export function useIntegrationSync(): UseSyncSchedulerReturn {
|
||||
'This integration was removed because it no longer exists on the server. Please reconnect if needed.',
|
||||
variant: 'destructive',
|
||||
});
|
||||
inFlightRef.current.delete(integrationId);
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -282,7 +293,9 @@ export function useIntegrationSync(): UseSyncSchedulerReturn {
|
||||
} else {
|
||||
sendNotification('error', integration.name, result.error);
|
||||
}
|
||||
inFlightRef.current.delete(integrationId);
|
||||
} catch (error) {
|
||||
inFlightRef.current.delete(integrationId);
|
||||
if (!mountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -75,7 +75,7 @@ function saveNotifiedReminder(reminderId: string): void {
|
||||
export function useMeetingReminders(events: CalendarEvent[]) {
|
||||
const [permission, setPermission] = useState<NotificationPermission>('default');
|
||||
const [settings, setSettings] = useState<ReminderSettings>(loadSettings);
|
||||
const checkIntervalRef = useRef<NodeJS.Timeout | null>(null);
|
||||
const checkIntervalRef = useRef<ReturnType<typeof setInterval> | null>(null);
|
||||
|
||||
// Check current notification permission
|
||||
useEffect(() => {
|
||||
|
||||
@@ -8,11 +8,7 @@ import { isTauriEnvironment } from '@/api/tauri-adapter';
|
||||
import type { OAuthConnection } from '@/api/types';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
import { toastError } from '@/lib/error-reporting';
|
||||
import {
|
||||
extractOAuthCallback,
|
||||
setupDeepLinkListener,
|
||||
validateOAuthState,
|
||||
} from '@/lib/oauth-utils';
|
||||
import { extractOAuthCallback, setupDeepLinkListener, validateOAuthState } from '@/lib/oauth-utils';
|
||||
|
||||
export type OAuthFlowStatus =
|
||||
| 'idle'
|
||||
|
||||
@@ -35,7 +35,7 @@ const { mockAPI, mockGetAPI } = vi.hoisted(() => {
|
||||
});
|
||||
|
||||
// Mock the API module
|
||||
vi.mock('@/api', () => ({
|
||||
vi.mock('@/api/interface', () => ({
|
||||
getAPI: mockGetAPI,
|
||||
}));
|
||||
|
||||
|
||||
@@ -11,7 +11,7 @@
|
||||
*/
|
||||
|
||||
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { DiarizationJobStatus, MeetingState, ProcessingStatus } from '@/api/types';
|
||||
import { toast } from '@/hooks/use-toast';
|
||||
import { setEntitiesFromExtraction } from '@/lib/entity-store';
|
||||
@@ -63,9 +63,13 @@ export function usePostProcessing(options: UsePostProcessingOptions = {}): UsePo
|
||||
const pollStartTimeRef = useRef<number | null>(null);
|
||||
const diarizationJobIdRef = useRef<string | null>(null);
|
||||
const completedMeetingRef = useRef<string | null>(null);
|
||||
// Generation token to invalidate in-flight polls on stop/reset
|
||||
const pollGenerationRef = useRef(0);
|
||||
|
||||
/** Stop diarization polling */
|
||||
const stopPolling = useCallback(() => {
|
||||
// Increment generation to invalidate any in-flight polls
|
||||
pollGenerationRef.current++;
|
||||
if (pollTimeoutRef.current) {
|
||||
clearTimeout(pollTimeoutRef.current);
|
||||
pollTimeoutRef.current = null;
|
||||
@@ -94,8 +98,9 @@ export function usePostProcessing(options: UsePostProcessingOptions = {}): UsePo
|
||||
|
||||
/** Poll for diarization job status */
|
||||
const pollDiarization = useCallback(
|
||||
async (jobId: string) => {
|
||||
if (!isMountedRef.current) {
|
||||
async (jobId: string, generation: number) => {
|
||||
// Check if this poll has been invalidated (generation changed)
|
||||
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -119,7 +124,8 @@ export function usePostProcessing(options: UsePostProcessingOptions = {}): UsePo
|
||||
const api = getAPI();
|
||||
const status: DiarizationJobStatus = await api.getDiarizationJobStatus(jobId);
|
||||
|
||||
if (!isMountedRef.current) {
|
||||
// Re-check generation after async work
|
||||
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
|
||||
@@ -167,23 +173,27 @@ export function usePostProcessing(options: UsePostProcessingOptions = {}): UsePo
|
||||
return;
|
||||
}
|
||||
|
||||
// Continue polling with backoff
|
||||
// Continue polling with backoff (pass generation for cancellation check)
|
||||
currentPollIntervalRef.current = Math.min(
|
||||
currentPollIntervalRef.current * POLL_BACKOFF_MULTIPLIER,
|
||||
MAX_POLL_INTERVAL_MS
|
||||
);
|
||||
pollTimeoutRef.current = setTimeout(
|
||||
() => pollDiarization(jobId),
|
||||
() => pollDiarization(jobId, generation),
|
||||
currentPollIntervalRef.current
|
||||
);
|
||||
} catch {
|
||||
// Re-check generation after async work
|
||||
if (generation !== pollGenerationRef.current || !isMountedRef.current) {
|
||||
return;
|
||||
}
|
||||
// Network error - continue polling
|
||||
currentPollIntervalRef.current = Math.min(
|
||||
currentPollIntervalRef.current * POLL_BACKOFF_MULTIPLIER,
|
||||
MAX_POLL_INTERVAL_MS
|
||||
);
|
||||
pollTimeoutRef.current = setTimeout(
|
||||
() => pollDiarization(jobId),
|
||||
() => pollDiarization(jobId, generation),
|
||||
currentPollIntervalRef.current
|
||||
);
|
||||
}
|
||||
@@ -317,7 +327,11 @@ export function usePostProcessing(options: UsePostProcessingOptions = {}): UsePo
|
||||
if (response.status === 'queued' || response.status === 'running') {
|
||||
diarizationJobIdRef.current = response.job_id;
|
||||
pollStartTimeRef.current = Date.now();
|
||||
pollTimeoutRef.current = setTimeout(() => pollDiarization(response.job_id), pollInterval);
|
||||
const generation = pollGenerationRef.current;
|
||||
pollTimeoutRef.current = setTimeout(
|
||||
() => pollDiarization(response.job_id, generation),
|
||||
pollInterval
|
||||
);
|
||||
return;
|
||||
}
|
||||
|
||||
|
||||
@@ -1,7 +1,7 @@
// Hook for loading project members

import { useCallback, useEffect, useState } from 'react';
import { getAPI } from '@/api';
import { useCallback, useEffect, useRef, useState } from 'react';
import { getAPI } from '@/api/interface';
import { extractErrorMessage } from '@/api/helpers';
import type { ProjectMembership } from '@/api/types';

@@ -10,19 +10,37 @@ export function useProjectMembers(projectId?: string) {
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState<string | null>(null);

// Stale-request protection: track current request ID and mount state
const requestIdRef = useRef(0);
const isMountedRef = useRef(true);

const loadMembers = useCallback(async () => {
if (!projectId) {
return;
}

// Increment request ID to invalidate any in-flight requests
const thisRequestId = ++requestIdRef.current;

setIsLoading(true);
setError(null);
try {
const response = await getAPI().listProjectMembers({ project_id: projectId, limit: 100 });
setMembers(response.members);

// Only update state if this is still the current request and component is mounted
if (thisRequestId === requestIdRef.current && isMountedRef.current) {
setMembers(response.members);
}
} catch (err) {
setError(extractErrorMessage(err, 'Failed to load project members'));
// Only update state if this is still the current request and component is mounted
if (thisRequestId === requestIdRef.current && isMountedRef.current) {
setError(extractErrorMessage(err, 'Failed to load project members'));
}
} finally {
setIsLoading(false);
// Only update loading state if this is still the current request and component is mounted
if (thisRequestId === requestIdRef.current && isMountedRef.current) {
setIsLoading(false);
}
}
}, [projectId]);

@@ -30,5 +48,13 @@ export function useProjectMembers(projectId?: string) {
void loadMembers();
}, [loadMembers]);

// Mount/unmount tracking
useEffect(() => {
isMountedRef.current = true;
return () => {
isMountedRef.current = false;
};
}, []);

return { members, isLoading, error, refresh: loadMembers };
}

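Reviewer note: the request-ID guard added to `useProjectMembers` above is the standard stale-response pattern — only the most recent call is allowed to commit state. A minimal sketch of the idea, assuming a generic `fetchMembers` function rather than the real API:

```typescript
// Minimal sketch of the stale-request guard (names are illustrative).
let latestRequestId = 0;

async function load(
  fetchMembers: () => Promise<string[]>,
  setMembers: (members: string[]) => void
): Promise<void> {
  const requestId = ++latestRequestId; // identity of this call
  const members = await fetchMembers();
  if (requestId !== latestRequestId) return; // a newer call started; drop this result
  setMembers(members);
}
```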
@@ -1,5 +1,5 @@
|
||||
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
|
||||
import { getAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import { isTauriEnvironment } from '@/api/tauri-adapter';
|
||||
import type {
|
||||
AppMatcher,
|
||||
@@ -231,9 +231,7 @@ export function useRecordingAppPolicy(): RecordingAppPolicyState {
|
||||
// Append new apps, avoiding duplicates
|
||||
setInstalledApps((prev) => {
|
||||
const existingIds = new Set(prev.map((app) => canonicalIdForApp(app)));
|
||||
const newApps = response.apps.filter(
|
||||
(app) => !existingIds.has(canonicalIdForApp(app))
|
||||
);
|
||||
const newApps = response.apps.filter((app) => !existingIds.has(canonicalIdForApp(app)));
|
||||
return [...prev, ...newApps];
|
||||
});
|
||||
currentPageRef.current += 1;
|
||||
|
||||
@@ -4,13 +4,9 @@
|
||||
|
||||
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||
|
||||
import {
|
||||
getAPI,
|
||||
isTauriEnvironment,
|
||||
mockAPI,
|
||||
type NoteFlowAPI,
|
||||
type TranscriptionStream,
|
||||
} from '@/api';
|
||||
import { isTauriEnvironment, mockAPI } from '@/api';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import type { NoteFlowAPI, TranscriptionStream } from '@/api';
|
||||
import { TauriEvents } from '@/api/tauri-adapter';
|
||||
import type { FinalSegment, Meeting, TranscriptUpdate } from '@/api/types';
|
||||
import { useConnectionState } from '@/contexts/connection-state';
|
||||
@@ -224,7 +220,10 @@ export function useRecordingSession(
|
||||
const streamState = await getAPI().getStreamState();
|
||||
if (streamState.state === 'starting' && (streamState.started_at_secs_ago ?? 0) > 10) {
|
||||
await getAPI().resetStreamState();
|
||||
toast({ title: 'Stream recovered', description: 'A stuck stream was automatically reset.' });
|
||||
toast({
|
||||
title: 'Stream recovered',
|
||||
description: 'A stuck stream was automatically reset.',
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
addClientLog({
|
||||
|
||||
@@ -56,7 +56,11 @@ export function useSecureIntegrationSecrets() {
|
||||
source: 'app',
|
||||
message: `Secret retrieval failed for integration field`,
|
||||
details: error instanceof Error ? error.message : String(error),
|
||||
metadata: { context: 'secure_integration_secrets', integration_id: integration.id, field },
|
||||
metadata: {
|
||||
context: 'secure_integration_secrets',
|
||||
integration_id: integration.id,
|
||||
field,
|
||||
},
|
||||
});
|
||||
}
|
||||
}
|
||||
@@ -81,8 +85,17 @@ export function useSecureIntegrationSecrets() {
|
||||
for (const field of fields) {
|
||||
const value = getNestedValue(integration, field);
|
||||
const secretKey = getSecretKey(integration.id, field);
|
||||
|
||||
// Enforce string type - only store non-empty strings, delete otherwise
|
||||
const stringValue = typeof value === 'string' ? value : '';
|
||||
|
||||
try {
|
||||
await setSecureValue(secretKey, value);
|
||||
if (stringValue) {
|
||||
await setSecureValue(secretKey, stringValue);
|
||||
} else {
|
||||
// Clear the secret if value is empty/undefined/non-string
|
||||
await setSecureValue(secretKey, '');
|
||||
}
|
||||
} catch (error) {
|
||||
// Secret storage failed - integration may work without persisted secret
|
||||
addClientLog({
|
||||
@@ -90,7 +103,11 @@ export function useSecureIntegrationSecrets() {
|
||||
source: 'app',
|
||||
message: `Secret storage failed for integration field`,
|
||||
details: error instanceof Error ? error.message : String(error),
|
||||
metadata: { context: 'secure_integration_secrets', integration_id: integration.id, field },
|
||||
metadata: {
|
||||
context: 'secure_integration_secrets',
|
||||
integration_id: integration.id,
|
||||
field,
|
||||
},
|
||||
});
|
||||
}
|
||||
}
|
||||
@@ -117,7 +134,11 @@ export function useSecureIntegrationSecrets() {
|
||||
source: 'app',
|
||||
message: `Secret clearing failed for integration field`,
|
||||
details: error instanceof Error ? error.message : String(error),
|
||||
metadata: { context: 'secure_integration_secrets', integration_id: integration.id, field },
|
||||
metadata: {
|
||||
context: 'secure_integration_secrets',
|
||||
integration_id: integration.id,
|
||||
field,
|
||||
},
|
||||
});
|
||||
}
|
||||
}
|
||||
@@ -138,28 +159,36 @@ export function useSecureIntegrationSecrets() {
/**
* Check secure storage health and attempt migration if needed.
*
* @returns Object with health status and whether migration was attempted/successful
* Migration state semantics:
* - 'not_needed': Storage was already healthy, no migration required
* - 'succeeded': Migration was performed and succeeded
* - 'failed': Migration was attempted but failed
*
* @returns Object with health status and migration result
*/
const checkHealthAndMigrate = useCallback(async (): Promise<{
status: SecureStorageStatus;
migrationAttempted: boolean;
migrationSucceeded: boolean;
migrationState: 'not_needed' | 'succeeded' | 'failed';
}> => {
if (!available) {
return { status: 'unavailable', migrationAttempted: false, migrationSucceeded: false };
return { status: 'unavailable', migrationState: 'not_needed' };
}

// First, attempt migration (only runs if needed)
const migrationSucceeded = await migrateSecureStorage();
const migrationAttempted = !migrationSucceeded; // If it failed, migration was actually attempted
// Check pre-migration health to determine if migration is needed
const preStatus = await checkSecureStorageHealth();
if (preStatus === 'healthy') {
return { status: 'healthy', migrationState: 'not_needed' };
}

// Then check current health
const status = await checkSecureStorageHealth();
// Attempt migration since storage is not healthy
const migrationSucceeded = await migrateSecureStorage();

// Check post-migration health
const postStatus = await checkSecureStorageHealth();

return {
status,
migrationAttempted: migrationAttempted || status === 'healthy', // If healthy after migration, it was migrated
migrationSucceeded,
status: postStatus,
migrationState: migrationSucceeded ? 'succeeded' : 'failed',
};
}, [available]);

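Reviewer note: the rewritten `checkHealthAndMigrate` above now reads as three explicit steps — pre-check, migrate only when unhealthy, then re-check — which removes the ambiguous `migrationAttempted` inference the old code used. A minimal sketch of that flow, with `checkHealth` and `migrate` passed in as stand-ins for the real storage helpers:

```typescript
// Minimal sketch of the check-then-migrate flow (helpers are illustrative stand-ins).
type MigrationState = 'not_needed' | 'succeeded' | 'failed';

async function checkHealthAndMigrate(
  checkHealth: () => Promise<'healthy' | 'degraded'>,
  migrate: () => Promise<boolean>
): Promise<{ status: string; migrationState: MigrationState }> {
  const preStatus = await checkHealth();
  if (preStatus === 'healthy') {
    return { status: preStatus, migrationState: 'not_needed' }; // nothing to do
  }
  const migrated = await migrate(); // only attempted when storage is unhealthy
  const postStatus = await checkHealth(); // report post-migration health to the caller
  return { status: postStatus, migrationState: migrated ? 'succeeded' : 'failed' };
}
```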
@@ -7,10 +7,7 @@
|
||||
import { useCallback, useEffect, useRef, useState } from 'react';
|
||||
import { getAPI } from '@/api/interface';
|
||||
import { extractErrorMessage } from '@/api/helpers';
|
||||
import type {
|
||||
StreamingConfiguration,
|
||||
UpdateStreamingConfigurationRequest,
|
||||
} from '@/api/types';
|
||||
import type { StreamingConfiguration, UpdateStreamingConfigurationRequest } from '@/api/types';
|
||||
import { useConnectionState } from '@/contexts/connection-state';
|
||||
|
||||
type LoadState = 'idle' | 'loading' | 'ready' | 'failed';
|
||||
|
||||
@@ -1,28 +1,19 @@
|
||||
/**
|
||||
* Model fetching functions for AI providers.
|
||||
*
|
||||
* Uses the strategy pattern to delegate to provider-specific implementations.
|
||||
*/
|
||||
|
||||
import { extractErrorMessage, getErrorMessage, isRecord } from '@/api/helpers';
|
||||
import type { AIProviderType, ModelCatalogEntry, TranscriptionProviderType } from '@/api/types';
|
||||
import type { AIProviderType, TranscriptionProviderType } from '@/api/types';
|
||||
|
||||
import {
|
||||
ANTHROPIC_API_VERSION,
|
||||
AZURE_OPENAI_API_VERSION,
|
||||
errorResult,
|
||||
successResult,
|
||||
} from './constants';
|
||||
import { errorResult } from './constants';
|
||||
import {
|
||||
getCachedModelCatalog,
|
||||
isModelCatalogFresh,
|
||||
setCachedModelCatalog,
|
||||
} from './model-catalog-cache';
|
||||
import {
|
||||
dedupeAndSortModels,
|
||||
extractModelEntries,
|
||||
filterGoogleModel,
|
||||
filterOpenAIModel,
|
||||
type ModelCatalogType,
|
||||
} from './model-catalog-utils';
|
||||
import { dedupeAndSortModels, type ModelCatalogType } from './model-catalog-utils';
|
||||
import { getStrategy, requiresApiKey as strategyRequiresApiKey } from './strategies';
|
||||
import type { FetchModelsResult } from './types';
|
||||
|
||||
type ConfigType = ModelCatalogType;
|
||||
@@ -31,277 +22,23 @@ type FetchModelsOptions = {
  forceRefresh?: boolean;
};

const AZURE_SPEECH_API_VERSIONS = ['v3.2', 'v3.1'];

/** Fetch models from OpenAI-compatible API. */
async function fetchOpenAIModels(
  baseUrl: string,
  apiKey: string,
  type: ConfigType
): Promise<FetchModelsResult> {
  try {
    const response = await fetch(`${baseUrl}/models`, {
      headers: {
        Authorization: `Bearer ${apiKey}`,
      },
    });

    if (!response.ok) {
      const errorPayload: unknown = await response.json().catch(() => null);
      return errorResult(getErrorMessage(errorPayload) || `HTTP ${response.status}`);
    }

    const data: unknown = await response.json();
    const items = isRecord(data) && Array.isArray(data.data) ? data.data : [];
    const models = extractModelEntries(items, ['id'])
      .filter((model) => filterOpenAIModel(model.id, type));

    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Failed to fetch models'));
  }
}

/** Fetch Anthropic models. */
async function fetchAnthropicModels(baseUrl: string, apiKey: string): Promise<FetchModelsResult> {
  try {
    const response = await fetch(`${baseUrl}/models`, {
      headers: {
        'x-api-key': apiKey,
        'anthropic-version': ANTHROPIC_API_VERSION,
      },
    });

    if (!response.ok) {
      return errorResult(`HTTP ${response.status}`);
    }

    const data: unknown = await response.json();
    const items = isRecord(data) && Array.isArray(data.data) ? data.data : [];
    const models = extractModelEntries(items, ['id', 'name']);
    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Failed to fetch models'));
  }
}

/** Fetch models from Google AI API. */
async function fetchGoogleModels(
  baseUrl: string,
  apiKey: string,
  type: ConfigType
): Promise<FetchModelsResult> {
  try {
    const response = await fetch(`${baseUrl}/models?key=${apiKey}`);
    if (!response.ok) {
      return errorResult(`HTTP ${response.status}`);
    }
    const data: unknown = await response.json();
    const items = isRecord(data) && Array.isArray(data.models) ? data.models : [];
    const models = extractModelEntries(items, ['name'], (name) => name.replace(/^models\//, ''))
      .filter((model) => filterGoogleModel(model.id, type));
    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Failed to fetch models'));
  }
}

/** Fetch models from local Ollama instance. */
async function fetchOllamaModels(baseUrl: string): Promise<FetchModelsResult> {
  try {
    const response = await fetch(`${baseUrl}/tags`);
    if (!response.ok) {
      return errorResult(`HTTP ${response.status}`);
    }
    const data: unknown = await response.json();
    const items = isRecord(data) && Array.isArray(data.models) ? data.models : [];
    const models = extractModelEntries(items, ['name']);
    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Could not connect to Ollama'));
  }
}

function extractDeepgramModelEntries(data: unknown): ModelCatalogEntry[] {
  if (!isRecord(data)) {
    return [];
  }
  const directArray = Array.isArray(data.models) ? data.models : [];
  if (directArray.length > 0) {
    return extractModelEntries(directArray, ['name', 'id', 'model_id']);
  }
  const sttModels = Array.isArray(data.stt) ? data.stt : [];
  if (sttModels.length > 0) {
    return extractModelEntries(sttModels, ['name', 'id', 'model_id']);
  }
  if (isRecord(data.models) && Array.isArray(data.models.stt)) {
    return extractModelEntries(data.models.stt, ['name', 'id', 'model_id']);
  }
  return [];
}

/** Fetch Deepgram models. */
async function fetchDeepgramModels(baseUrl: string, apiKey: string): Promise<FetchModelsResult> {
  const base = baseUrl.replace(/\/v1\/?$/, '');
  try {
    const projectsResponse = await fetch(`${base}/v1/projects`, {
      headers: { Authorization: `Token ${apiKey}` },
    });
    if (projectsResponse.ok) {
      const projectsData: unknown = await projectsResponse.json();
      const projects = isRecord(projectsData) && Array.isArray(projectsData.projects)
        ? projectsData.projects
        : [];
      const projectId = projects
        .map((project) => (isRecord(project) ? project.project_id ?? project.id : null))
        .find((id): id is string => typeof id === 'string' && id.length > 0);
      if (projectId) {
        const modelsResponse = await fetch(`${base}/v1/projects/${projectId}/models`, {
          headers: { Authorization: `Token ${apiKey}` },
        });
        if (modelsResponse.ok) {
          const modelsData: unknown = await modelsResponse.json();
          const models = extractDeepgramModelEntries(modelsData);
          return successResult(dedupeAndSortModels(models));
        }
      }
    }
  } catch {
    // Ignore and fall back to public models endpoint.
  }

  try {
    const response = await fetch(`${base}/v1/models`, {
      headers: { Authorization: `Token ${apiKey}` },
    });
    if (!response.ok) {
      return errorResult(`HTTP ${response.status}`);
    }
    const data: unknown = await response.json();
    const models = extractDeepgramModelEntries(data);
    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Failed to fetch Deepgram models'));
  }
}

/** Fetch models from ElevenLabs API. */
async function fetchElevenLabsModels(
  baseUrl: string,
  apiKey: string
): Promise<FetchModelsResult> {
  try {
    const response = await fetch(`${baseUrl}/models`, { headers: { 'xi-api-key': apiKey } });
    if (!response.ok) {
      return errorResult(`HTTP ${response.status}`);
    }
    const data: unknown = await response.json();
    const items = Array.isArray(data) ? data : [];
    const models: ModelCatalogEntry[] = [];
    for (const item of items) {
      if (!isRecord(item) || item.can_do_text_to_speech === false) {
        continue;
      }
      models.push(...extractModelEntries([item], ['model_id', 'id', 'name']));
    }
    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Failed to fetch models'));
  }
}

/** Fetch Azure OpenAI deployment list. */
async function fetchAzureOpenAIModels(
  baseUrl: string,
  apiKey: string
): Promise<FetchModelsResult> {
  try {
    const response = await fetch(
      `${baseUrl.replace(/\/+$/, '')}/openai/deployments?api-version=${AZURE_OPENAI_API_VERSION}`,
      {
        headers: { 'api-key': apiKey },
      }
    );
    if (!response.ok) {
      return errorResult(`HTTP ${response.status}`);
    }
    const data: unknown = await response.json();
    const items = isRecord(data) && Array.isArray(data.data)
      ? data.data
      : isRecord(data) && Array.isArray(data.value)
        ? data.value
        : [];
    const models = extractModelEntries(items, ['id', 'name', 'deployment_name']);
    return successResult(dedupeAndSortModels(models));
  } catch (error: unknown) {
    return errorResult(extractErrorMessage(error, 'Failed to fetch deployments'));
  }
}

/** Fetch Azure Speech-to-text models. */
async function fetchAzureSpeechModels(
  baseUrl: string,
  apiKey: string
): Promise<FetchModelsResult> {
  const trimmed = baseUrl.replace(/\/+$/, '');
  for (const version of AZURE_SPEECH_API_VERSIONS) {
    try {
      const response = await fetch(`${trimmed}/speechtotext/${version}/models/base`, {
        headers: { 'Ocp-Apim-Subscription-Key': apiKey },
      });
      if (!response.ok) {
        continue;
      }
      const data: unknown = await response.json();
      const items = isRecord(data) && Array.isArray(data.values)
        ? data.values
        : isRecord(data) && Array.isArray(data.models)
          ? data.models
          : [];
      const models = extractModelEntries(items, ['shortName', 'name', 'id']);
      return successResult(dedupeAndSortModels(models));
    } catch {
      // Try the next API version.
    }
  }
  return errorResult('Azure Speech endpoint not reachable');
}
/**
 * Fetch models from the provider using its strategy.
 */
async function fetchModelsFromProvider(
  provider: AIProviderType | TranscriptionProviderType,
  baseUrl: string,
  apiKey: string,
  type: ConfigType
): Promise<FetchModelsResult> {
  switch (provider) {
    case 'openai':
    case 'whisper':
    case 'custom':
      return fetchOpenAIModels(baseUrl, apiKey, type);
  const isTranscription = type === 'transcription';
  const strategy = getStrategy(provider, isTranscription);

    case 'anthropic':
      return fetchAnthropicModels(baseUrl, apiKey);

    case 'google':
      return fetchGoogleModels(baseUrl, apiKey, type);

    case 'ollama':
      return fetchOllamaModels(baseUrl);

    case 'azure':
      return type === 'transcription'
        ? fetchAzureSpeechModels(baseUrl, apiKey)
        : fetchAzureOpenAIModels(baseUrl, apiKey);

    case 'deepgram':
      return fetchDeepgramModels(baseUrl, apiKey);

    case 'elevenlabs':
      return fetchElevenLabsModels(baseUrl, apiKey);

    default:
      return errorResult('Unknown provider');
  if (!strategy) {
    return errorResult('Unknown provider');
  }

  return strategy.fetchModels(baseUrl, apiKey, type);
}

/** Fetch available models from the specified AI provider (with caching). */

@@ -316,8 +53,9 @@ export async function fetchModels(
  if (!normalizedBaseUrl) {
    return errorResult('Base URL is required');
  }
  const requiresApiKey = provider !== 'ollama' && provider !== 'custom';
  if (requiresApiKey && !apiKey) {

  const needsApiKey = strategyRequiresApiKey(provider);
  if (needsApiKey && !apiKey) {
    return errorResult('API key is required');
  }

@@ -70,7 +70,7 @@ function extractCostLabel(record: Record<string, unknown>): string | undefined {
  }

  if (isRecord(record.pricing)) {
    const {pricing} = record;
    const { pricing } = record;
    const input = getStringOrNumber(pricing, ['prompt', 'input', 'input_cost', 'prompt_cost']);
    const output = getStringOrNumber(pricing, [
      'completion',
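Note: the refactored `fetchModelsFromProvider` and `fetchModels` depend on a `./strategies` registry (`getStrategy`, `requiresApiKey`) that is not shown in this diff. Below is a minimal sketch of how such a registry could be wired, assuming the strategy classes introduced in the new files and a hypothetical index module; the real implementation may differ.

```ts
// Sketch only: the actual ./strategies/index.ts is not part of this diff.
import { AnthropicStrategy } from './anthropic';
import { AzureOpenAIStrategy, AzureSpeechStrategy } from './azure';
import { CustomStrategy } from './custom';
import type { BaseProviderStrategy } from './types';

// Hypothetical registry keyed by provider id.
const strategies: Record<string, BaseProviderStrategy> = {
  anthropic: new AnthropicStrategy(),
  azure: new AzureOpenAIStrategy(),
  'azure-speech': new AzureSpeechStrategy(),
  custom: new CustomStrategy(),
  // ...remaining providers (openai, google, ollama, deepgram, elevenlabs)
};

/** Resolve a strategy, switching Azure to its Speech variant for transcription. */
export function getStrategy(
  provider: string,
  isTranscription: boolean
): BaseProviderStrategy | undefined {
  if (provider === 'azure' && isTranscription) {
    return strategies['azure-speech'];
  }
  return strategies[provider];
}

/** Replacement for the old inline check, now driven by strategy metadata. */
export function requiresApiKey(provider: string): boolean {
  return strategies[provider]?.requiresApiKey ?? true;
}
```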
client/src/lib/ai-providers/strategies/anthropic.ts (new file, 83 lines)
@@ -0,0 +1,83 @@
/**
 * Anthropic Provider Strategy
 *
 * Handles model fetching and endpoint testing for the Anthropic Claude API.
 */

import { HttpStatus } from '@/api/constants';
import { extractErrorMessage, isRecord } from '@/api/helpers';

import { ANTHROPIC_API_VERSION, errorResult, successResult } from '../constants';
import { dedupeAndSortModels, extractModelEntries } from '../model-catalog-utils';
import type { FetchModelsResult, TestEndpointResult } from '../types';

import { BaseProviderStrategy, type EndpointTestType, type ModelCatalogType } from './types';

/**
 * Anthropic provider strategy implementation.
 */
export class AnthropicStrategy extends BaseProviderStrategy {
  readonly providerId = 'anthropic';
  readonly displayName = 'Anthropic';
  readonly requiresApiKey = true;

  async fetchModels(
    baseUrl: string,
    apiKey: string,
    _type: ModelCatalogType
  ): Promise<FetchModelsResult> {
    try {
      const response = await fetch(`${baseUrl}/models`, {
        headers: {
          'x-api-key': apiKey,
          'anthropic-version': ANTHROPIC_API_VERSION,
        },
      });

      if (!response.ok) {
        return errorResult(`HTTP ${response.status}`);
      }

      const data: unknown = await response.json();
      const items = isRecord(data) && Array.isArray(data.data) ? data.data : [];
      const models = extractModelEntries(items, ['id', 'name']);

      return successResult(dedupeAndSortModels(models));
    } catch (error: unknown) {
      return errorResult(extractErrorMessage(error, 'Failed to fetch models'));
    }
  }

  async testEndpoint(
    baseUrl: string,
    apiKey: string,
    model: string,
    _type: EndpointTestType,
    startTime: number
  ): Promise<TestEndpointResult> {
    const response = await fetch(`${baseUrl}/messages`, {
      method: 'POST',
      headers: {
        'x-api-key': apiKey,
        'anthropic-version': ANTHROPIC_API_VERSION,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        model: model,
        max_tokens: 10,
        messages: [{ role: 'user', content: 'Say "ok"' }],
      }),
    });

    // Rate limited but key is valid
    if (response.ok || response.status === HttpStatus.TOO_MANY_REQUESTS) {
      const message =
        response.status === HttpStatus.TOO_MANY_REQUESTS
          ? 'API key valid (rate limited)'
          : 'Connection successful';
      return this.successTestResult(message, startTime);
    }

    return this.failTestResult(`HTTP ${response.status}`);
  }
}
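All of the new strategy files extend `BaseProviderStrategy` and import shared types from `./types`, which this diff does not include. The following is a rough sketch of what that base module likely provides, inferred from how the subclasses use `successTestResult`/`failTestResult` and the result fields checked elsewhere in the diff; every name and shape here is an assumption, not the committed code.

```ts
// Sketch of the assumed ./types module; shapes are inferred from usage in this diff.
export type ModelCatalogType = 'chat' | 'transcription'; // assumed union values
export type EndpointTestType = 'summary' | 'embedding' | 'transcription'; // assumed union values

export interface FetchModelsResult {
  success: boolean;
  models?: { id: string; label?: string }[];
  error?: string;
}

export interface TestEndpointResult {
  success: boolean;
  message: string;
  latencyMs?: number;
}

export abstract class BaseProviderStrategy {
  abstract readonly providerId: string;
  abstract readonly displayName: string;
  abstract readonly requiresApiKey: boolean;

  abstract fetchModels(
    baseUrl: string,
    apiKey: string,
    type: ModelCatalogType
  ): Promise<FetchModelsResult>;

  abstract testEndpoint(
    baseUrl: string,
    apiKey: string,
    model: string,
    type: EndpointTestType,
    startTime: number
  ): Promise<TestEndpointResult>;

  /** Helper used by subclasses to report a successful probe with elapsed time. */
  protected successTestResult(message: string, startTime: number): TestEndpointResult {
    return { success: true, message, latencyMs: Date.now() - startTime };
  }

  /** Helper used by subclasses to report a failed probe. */
  protected failTestResult(message: string): TestEndpointResult {
    return { success: false, message };
  }
}
```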
client/src/lib/ai-providers/strategies/azure.ts (new file, 141 lines)
@@ -0,0 +1,141 @@
/**
 * Azure Provider Strategies
 *
 * Handles model fetching and endpoint testing for Azure OpenAI and Azure Speech services.
 */

import { extractErrorMessage, isRecord } from '@/api/helpers';

import { AZURE_OPENAI_API_VERSION, errorResult, successResult } from '../constants';
import { dedupeAndSortModels, extractModelEntries } from '../model-catalog-utils';
import type { FetchModelsResult, TestEndpointResult } from '../types';

import { BaseProviderStrategy, type EndpointTestType, type ModelCatalogType } from './types';

/** Azure Speech API versions to try (in order). */
const AZURE_SPEECH_API_VERSIONS = ['v3.2', 'v3.1'];

/**
 * Azure OpenAI provider strategy implementation.
 * Handles Azure-hosted OpenAI deployments.
 */
export class AzureOpenAIStrategy extends BaseProviderStrategy {
  readonly providerId = 'azure';
  readonly displayName = 'Azure OpenAI';
  readonly requiresApiKey = true;

  async fetchModels(
    baseUrl: string,
    apiKey: string,
    _type: ModelCatalogType
  ): Promise<FetchModelsResult> {
    try {
      const response = await fetch(
        `${baseUrl.replace(/\/+$/, '')}/openai/deployments?api-version=${AZURE_OPENAI_API_VERSION}`,
        {
          headers: { 'api-key': apiKey },
        }
      );

      if (!response.ok) {
        return errorResult(`HTTP ${response.status}`);
      }

      const data: unknown = await response.json();
      const items =
        isRecord(data) && Array.isArray(data.data)
          ? data.data
          : isRecord(data) && Array.isArray(data.value)
            ? data.value
            : [];
      const models = extractModelEntries(items, ['id', 'name', 'deployment_name']);

      return successResult(dedupeAndSortModels(models));
    } catch (error: unknown) {
      return errorResult(extractErrorMessage(error, 'Failed to fetch deployments'));
    }
  }

  async testEndpoint(
    baseUrl: string,
    apiKey: string,
    _model: string,
    _type: EndpointTestType,
    startTime: number
  ): Promise<TestEndpointResult> {
    // Azure OpenAI uses custom endpoint; test by fetching models
    const response = await fetch(`${baseUrl.replace(/\/+$/, '')}/models`, {
      headers: {
        Authorization: `Bearer ${apiKey}`,
      },
    });

    if (response.ok) {
      return this.successTestResult('Endpoint is responding', startTime);
    }

    return this.failTestResult(`HTTP ${response.status}`);
  }
}

/**
 * Azure Speech provider strategy implementation.
 * Handles Azure Cognitive Services Speech-to-Text.
 */
export class AzureSpeechStrategy extends BaseProviderStrategy {
  readonly providerId = 'azure-speech';
  readonly displayName = 'Azure Speech';
  readonly requiresApiKey = true;

  async fetchModels(
    baseUrl: string,
    apiKey: string,
    _type: ModelCatalogType
  ): Promise<FetchModelsResult> {
    const trimmed = baseUrl.replace(/\/+$/, '');

    for (const version of AZURE_SPEECH_API_VERSIONS) {
      try {
        const response = await fetch(`${trimmed}/speechtotext/${version}/models/base`, {
          headers: { 'Ocp-Apim-Subscription-Key': apiKey },
        });

        if (!response.ok) {
          continue;
        }

        const data: unknown = await response.json();
        const items =
          isRecord(data) && Array.isArray(data.values)
            ? data.values
            : isRecord(data) && Array.isArray(data.models)
              ? data.models
              : [];
        const models = extractModelEntries(items, ['shortName', 'name', 'id']);

        return successResult(dedupeAndSortModels(models));
      } catch {
        // Try the next API version.
      }
    }

    return errorResult('Azure Speech endpoint not reachable');
  }

  async testEndpoint(
    baseUrl: string,
    apiKey: string,
    _model: string,
    _type: EndpointTestType,
    startTime: number
  ): Promise<TestEndpointResult> {
    // Test by attempting to fetch models
    const result = await this.fetchModels(baseUrl, apiKey, 'transcription');

    if (result.success) {
      return this.successTestResult('Azure Speech endpoint is working', startTime);
    }

    return this.failTestResult(result.error ?? 'Azure Speech endpoint not reachable');
  }
}
client/src/lib/ai-providers/strategies/custom.ts (new file, 124 lines)
@@ -0,0 +1,124 @@
/**
 * Custom Provider Strategy
 *
 * Handles model fetching and endpoint testing for custom/unknown OpenAI-compatible APIs.
 */

import { extractErrorMessage, getErrorMessage, isRecord } from '@/api/helpers';

import { errorResult, successResult } from '../constants';
import {
  dedupeAndSortModels,
  extractModelEntries,
  filterOpenAIModel,
} from '../model-catalog-utils';
import type { FetchModelsResult, TestEndpointResult } from '../types';

import { BaseProviderStrategy, type EndpointTestType, type ModelCatalogType } from './types';

/**
 * Custom/OpenAI-compatible provider strategy implementation.
 * Assumes OpenAI-compatible API format.
 */
export class CustomStrategy extends BaseProviderStrategy {
  readonly providerId = 'custom';
  readonly displayName = 'Custom';
  readonly requiresApiKey = false;

  async fetchModels(
    baseUrl: string,
    apiKey: string,
    type: ModelCatalogType
  ): Promise<FetchModelsResult> {
    try {
      const response = await fetch(`${baseUrl}/models`, {
        headers: apiKey
          ? {
              Authorization: `Bearer ${apiKey}`,
            }
          : {},
      });

      if (!response.ok) {
        const errorPayload: unknown = await response.json().catch(() => null);
        return errorResult(getErrorMessage(errorPayload) || `HTTP ${response.status}`);
      }

      const data: unknown = await response.json();
      const items = isRecord(data) && Array.isArray(data.data) ? data.data : [];
      const models = extractModelEntries(items, ['id']).filter((model) =>
        filterOpenAIModel(model.id, type)
      );

      return successResult(dedupeAndSortModels(models));
    } catch (error: unknown) {
      return errorResult(extractErrorMessage(error, 'Failed to fetch models'));
    }
  }

  async testEndpoint(
    baseUrl: string,
    apiKey: string,
    model: string,
    type: EndpointTestType,
    startTime: number
  ): Promise<TestEndpointResult> {
    if (type === 'transcription') {
      return this.testCustomTranscription(baseUrl, apiKey, startTime);
    }

    // For summary/embedding, use OpenAI-compatible chat completions
    return this.testChatEndpoint(baseUrl, apiKey, model, startTime);
  }

  private async testCustomTranscription(
    baseUrl: string,
    apiKey: string,
    startTime: number
  ): Promise<TestEndpointResult> {
    const response = await fetch(`${baseUrl}/models`, {
      headers: apiKey
        ? {
            Authorization: `Bearer ${apiKey}`,
          }
        : {},
    });

    if (response.ok) {
      return this.successTestResult('Endpoint is responding', startTime);
    }

    return this.failTestResult(`HTTP ${response.status}`);
  }

  private async testChatEndpoint(
    baseUrl: string,
    apiKey: string,
    model: string,
    startTime: number
  ): Promise<TestEndpointResult> {
    const response = await fetch(`${baseUrl}/chat/completions`, {
      method: 'POST',
      headers: apiKey
        ? {
            Authorization: `Bearer ${apiKey}`,
            'Content-Type': 'application/json',
          }
        : {
            'Content-Type': 'application/json',
          },
      body: JSON.stringify({
        model: model,
        messages: [{ role: 'user', content: 'Say "test successful" in exactly 2 words.' }],
        max_tokens: 10,
      }),
    });

    if (!response.ok) {
      const errorPayload: unknown = await response.json().catch(() => null);
      return this.failTestResult(getErrorMessage(errorPayload) || `HTTP ${response.status}`);
    }

    return this.successTestResult('Chat completion endpoint is working', startTime);
  }
}
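Because `requiresApiKey` is `false` for the custom strategy, it can be exercised against a local OpenAI-compatible server without credentials. A hedged Vitest-style sketch of such a test follows; the test file, endpoint URL, and the `'chat'` catalog value are hypothetical and not part of this commit.

```ts
// Hypothetical test sketch; not part of this commit.
import { describe, expect, it, vi } from 'vitest';

import { CustomStrategy } from './custom';

describe('CustomStrategy', () => {
  it('lists models from an OpenAI-compatible /models endpoint', async () => {
    // Stub fetch with a minimal OpenAI-style payload.
    vi.stubGlobal(
      'fetch',
      vi.fn().mockResolvedValue(
        new Response(JSON.stringify({ data: [{ id: 'local-model' }] }), { status: 200 })
      )
    );

    const strategy = new CustomStrategy();
    // 'chat' is an assumed ModelCatalogType value; adjust to the real union.
    const result = await strategy.fetchModels('http://localhost:1234/v1', '', 'chat');

    expect(result.success).toBe(true);
    vi.unstubAllGlobals();
  });
});
```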
Some files were not shown because too many files have changed in this diff.