Update client submodule reference to indicate a dirty state, reflecting uncommitted changes.
This commit is contained in:
146
CLAUDE.md
146
CLAUDE.md
@@ -79,8 +79,8 @@ Dev container features: dbus-x11, GTK-3, libgl1 for system tray and hotkey suppo
|
||||
|
||||
```
|
||||
src/noteflow/
|
||||
├── domain/ # Entities (meeting, segment, annotation, summary, triggers) + ports
|
||||
├── application/ # Use-cases/services (MeetingService, RecoveryService, ExportService, SummarizationService, TriggerService)
|
||||
├── domain/ # Entities (meeting, segment, annotation, summary, triggers, webhooks, integrations) + ports
|
||||
├── application/ # Use-cases/services (MeetingService, RecoveryService, ExportService, SummarizationService, TriggerService, WebhookService)
|
||||
├── infrastructure/ # Implementations
|
||||
│ ├── audio/ # sounddevice capture, ring buffer, VU levels, playback, buffered writer
|
||||
│ ├── asr/ # faster-whisper engine, VAD segmenter, streaming
|
||||
@@ -89,10 +89,11 @@ src/noteflow/
|
||||
│ ├── triggers/ # Auto-start signal providers (calendar, audio activity, foreground app)
|
||||
│ ├── persistence/ # SQLAlchemy + asyncpg + pgvector, Alembic migrations
|
||||
│ ├── security/ # keyring keystore, AES-GCM encryption
|
||||
│ ├── export/ # Markdown/HTML export
|
||||
│ └── converters/ # ORM ↔ domain entity converters
|
||||
│ ├── export/ # Markdown/HTML/PDF export
|
||||
│ ├── webhooks/ # Webhook executor with retry logic and HMAC signing
|
||||
│ └── converters/ # ORM ↔ domain entity converters (including webhook converters)
|
||||
├── grpc/ # Proto definitions, server, client, meeting store, modular mixins
|
||||
└── config/ # Pydantic settings (NOTEFLOW_ env vars)
|
||||
└── config/ # Pydantic settings (NOTEFLOW_ env vars) + feature flags
|
||||
```
|
||||
|
||||
Frontend (Tauri + React) lives outside the Python package:
|
||||
@@ -113,6 +114,25 @@ client/
|
||||
- Protocol-based DI (see `domain/ports/` and infrastructure `protocols.py` files)
|
||||
- Modular gRPC mixins for separation of concerns (see below)
|
||||
|
||||
### Domain Package Structure
|
||||
|
||||
```
|
||||
domain/
|
||||
├── entities/ # Core domain entities
|
||||
│ ├── meeting.py # Meeting, MeetingId, MeetingState
|
||||
│ ├── segment.py # Segment, WordTiming
|
||||
│ ├── summary.py # Summary, KeyPoint, ActionItem
|
||||
│ ├── annotation.py # Annotation
|
||||
│ └── integration.py# Integration, IntegrationType, IntegrationStatus
|
||||
├── webhooks/ # Webhook domain
|
||||
│ └── events.py # WebhookEventType, WebhookConfig, WebhookDelivery, payload classes
|
||||
├── ports/ # Repository protocols
|
||||
│ ├── repositories.py # All repository protocols (MeetingRepository, WebhookRepository, etc.)
|
||||
│ └── unit_of_work.py # UnitOfWork protocol with supports_* capability checks
|
||||
└── utils/ # Domain utilities
|
||||
└── time.py # utc_now() helper
|
||||
```
|
||||
|
||||
## gRPC Mixin Architecture
|
||||
|
||||
The gRPC server uses modular mixins for maintainability:
|
||||
@@ -122,15 +142,25 @@ grpc/_mixins/
|
||||
├── streaming.py # ASR streaming, audio processing, partial buffers
|
||||
├── diarization.py # Speaker diarization jobs (background refinement, job TTL)
|
||||
├── summarization.py # Summary generation (separates LLM inference from DB transactions)
|
||||
├── meeting.py # Meeting lifecycle (create, get, list, delete)
|
||||
├── meeting.py # Meeting lifecycle (create, get, list, delete, stop)
|
||||
├── annotation.py # Segment annotations CRUD
|
||||
├── export.py # Markdown/HTML document export
|
||||
├── export.py # Markdown/HTML/PDF document export
|
||||
├── converters.py # Protobuf ↔ domain entity converters
|
||||
├── errors.py # gRPC error helpers (abort_not_found, abort_invalid_argument)
|
||||
└── protocols.py # ServicerHost protocol for mixin composition
|
||||
```
|
||||
|
||||
Each mixin operates on `ServicerHost` protocol, enabling clean composition in `NoteFlowServicer`.
|
||||
|
||||
### Service Injection Pattern
|
||||
|
||||
Services are injected through a three-tier pattern:
|
||||
1. **ServicerHost Protocol** (`protocols.py`) — declares required service attributes
|
||||
2. **NoteFlowServicer** (`service.py`) — accepts services via `__init__` and stores as instance attributes
|
||||
3. **NoteFlowServer** (`server.py`) — creates/initializes services and passes to servicer
|
||||
|
||||
Example: `_webhook_service`, `_summarization_service`, `_ner_service` all follow this pattern.
|
||||
|
||||
## Client Architecture (Tauri + React)
|
||||
|
||||
- React components are in `client/src/components/`, shared UI types in `client/src/types/`, and Zustand state in `client/src/store/`.
|
||||
@@ -226,6 +256,20 @@ python -m grpc_tools.protoc -I src/noteflow/grpc/proto \
|
||||
- Frontend formatting uses Prettier (single quotes, 100 char width); linting uses Biome.
|
||||
- Rust formatting uses `rustfmt`; linting uses `clippy` via the client scripts.
|
||||
|
||||
## Feature Flags
|
||||
|
||||
Optional features controlled via `NOTEFLOW_FEATURE_*` environment variables:
|
||||
|
||||
| Flag | Default | Controls | Prerequisites |
|
||||
|------|---------|----------|---------------|
|
||||
| `NOTEFLOW_FEATURE_TEMPLATES_ENABLED` | `true` | AI summarization templates | — |
|
||||
| `NOTEFLOW_FEATURE_PDF_EXPORT_ENABLED` | `true` | PDF export format | WeasyPrint installed |
|
||||
| `NOTEFLOW_FEATURE_NER_ENABLED` | `false` | Named entity extraction | spaCy model downloaded |
|
||||
| `NOTEFLOW_FEATURE_CALENDAR_ENABLED` | `false` | Calendar sync | OAuth credentials configured |
|
||||
| `NOTEFLOW_FEATURE_WEBHOOKS_ENABLED` | `true` | Webhook notifications | — |
|
||||
|
||||
Access via `get_settings().features.<flag_name>`. Features with external dependencies default to `false`.
|
||||
|
||||
## Spikes (De-risking Experiments)
|
||||
|
||||
`spikes/` contains validated platform experiments with `FINDINGS.md`:
|
||||
@@ -242,14 +286,61 @@ python -m grpc_tools.protoc -I src/noteflow/grpc/proto \
|
||||
|
||||
### Summarization
|
||||
- **Providers**: CloudProvider (Anthropic/OpenAI), OllamaProvider (local), MockProvider (testing)
|
||||
- **Templates**: Configurable tone (professional/casual/technical), format (bullet_points/narrative/structured), verbosity (minimal/balanced/detailed)
|
||||
- **Citation verification**: Links summary claims to transcript evidence
|
||||
- **Consent**: Cloud providers require explicit user consent (not yet persisted)
|
||||
- **Consent**: Cloud providers require explicit user consent (stored in `user_preferences`)
|
||||
|
||||
### Export
|
||||
- **Formats**: Markdown, HTML, PDF (via WeasyPrint)
|
||||
- **Content**: Transcript with timestamps, speaker labels, summary with key points and action items
|
||||
- **gRPC**: `ExportTranscript` RPC with `ExportFormat` enum
|
||||
- **PDF styling**: Embedded CSS for professional document layout
|
||||
|
||||
### Named Entity Recognition (NER)
|
||||
Automatic extraction of people, companies, products, locations from transcripts.
|
||||
|
||||
- **Engine**: spaCy with transformer models (`en_core_web_sm` or `en_core_web_trf`)
|
||||
- **Categories**: person, company, product, technical, acronym, location, date, other
|
||||
- **Segment tracking**: Entities link back to source `segment_ids` for navigation
|
||||
- **Confidence scores**: Model confidence for each extracted entity
|
||||
- **Pinning**: Users can pin (confirm) entities for future reference
|
||||
- **gRPC**: `ExtractEntities` RPC with optional `force_refresh`
|
||||
- **Caching**: Entities persisted in `named_entities` table, cached until refresh
|
||||
|
||||
### Trigger Detection
|
||||
- **Signals**: Calendar proximity, audio activity, foreground app detection
|
||||
- **Actions**: IGNORE, NOTIFY, AUTO_START with confidence thresholds
|
||||
- **Client integration**: Background polling with dialog prompts (start/snooze/dismiss)
|
||||
|
||||
### Webhooks
|
||||
Automated HTTP notifications for meeting lifecycle events.
|
||||
|
||||
- **Events**: `meeting.completed`, `summary.generated`, `recording.started`, `recording.stopped`
|
||||
- **Delivery**: Exponential backoff retries (configurable `max_retries`, default 3)
|
||||
- **Security**: HMAC-SHA256 signing via `X-NoteFlow-Signature` header when secret configured
|
||||
- **Headers**: `X-NoteFlow-Event` (event type), `X-NoteFlow-Delivery` (unique delivery ID)
|
||||
- **Fire-and-forget**: Webhook failures never block primary RPC operations
|
||||
- **Persistence**: `webhook_configs` stores URL/events/secret, `webhook_deliveries` logs delivery attempts
|
||||
|
||||
Key files:
|
||||
- `domain/webhooks/events.py` — `WebhookEventType`, `WebhookConfig`, `WebhookDelivery`, payload dataclasses
|
||||
- `infrastructure/webhooks/executor.py` — HTTP client with retry logic
|
||||
- `application/services/webhook_service.py` — orchestrates delivery to registered webhooks
|
||||
|
||||
### Integrations
|
||||
OAuth-based external service connections (calendar providers, etc.).
|
||||
|
||||
- **Types**: `calendar` (Google, Outlook)
|
||||
- **Status tracking**: `pending`, `connected`, `error`, `disconnected`
|
||||
- **Secure storage**: OAuth tokens stored in `integration_secrets` table
|
||||
- **Sync history**: `integration_sync_runs` tracks each sync operation
|
||||
|
||||
ORM models in `persistence/models/integrations/`:
|
||||
- `IntegrationModel` — provider config and status
|
||||
- `IntegrationSecretModel` — encrypted OAuth tokens
|
||||
- `CalendarEventModel` — cached calendar events
|
||||
- `MeetingCalendarLinkModel` — links meetings to calendar events
|
||||
|
||||
## Shared Utilities & Factories
|
||||
|
||||
### Factories
|
||||
@@ -268,6 +359,7 @@ python -m grpc_tools.protoc -I src/noteflow/grpc/proto \
|
||||
|----------|----------------|---------|
|
||||
| `infrastructure/converters/orm_converters.py` | `OrmConverter` | ORM ↔ domain entities (Meeting, Segment, Summary, etc.) |
|
||||
| `infrastructure/converters/asr_converters.py` | `AsrConverter` | ASR DTOs → domain WordTiming |
|
||||
| `infrastructure/converters/webhook_converters.py` | `WebhookConverter` | ORM ↔ domain (`WebhookConfig`, `WebhookDelivery`) |
|
||||
| `grpc/_mixins/converters.py` | `meeting_to_proto()`, `segment_to_proto_update()` | Domain → protobuf messages |
|
||||
| `grpc/_mixins/converters.py` | `create_segment_from_asr()` | ASR result → Segment with word timings |
|
||||
|
||||
@@ -315,6 +407,14 @@ python -m grpc_tools.protoc -I src/noteflow/grpc/proto \
|
||||
|----------|---------|
|
||||
| `parse_calendar_events()` | Parse events from config/env |
|
||||
|
||||
### Webhooks (`infrastructure/webhooks/executor.py`)
|
||||
|
||||
| Class/Method | Purpose |
|
||||
|--------------|---------|
|
||||
| `WebhookExecutor` | HTTP delivery with retry logic and HMAC signing |
|
||||
| `WebhookExecutor.deliver()` | Deliver payload to webhook URL with exponential backoff |
|
||||
| `WebhookExecutor._build_headers()` | Build headers including `X-NoteFlow-Signature` |
|
||||
|
||||
### Recovery Service (`application/services/recovery_service.py`)
|
||||
|
||||
| Method | Purpose |
|
||||
@@ -322,9 +422,37 @@ python -m grpc_tools.protoc -I src/noteflow/grpc/proto \
|
||||
| `recover_all()` | Orchestrate meeting + job recovery |
|
||||
| `RecoveryResult` | Dataclass with recovery counts |
|
||||
|
||||
### Webhook Service (`application/services/webhook_service.py`)
|
||||
|
||||
| Method | Purpose |
|
||||
|--------|---------|
|
||||
| `register_webhook()` | Register a webhook configuration |
|
||||
| `trigger_meeting_completed()` | Fire webhooks on meeting completion |
|
||||
| `trigger_summary_generated()` | Fire webhooks on summary generation |
|
||||
| `trigger_recording_started/stopped()` | Fire webhooks on recording lifecycle |
|
||||
|
||||
### Unit of Work Repositories
|
||||
|
||||
The `UnitOfWork` protocol provides access to all repositories:
|
||||
|
||||
| Property | Repository | Supports In-Memory |
|
||||
|----------|------------|-------------------|
|
||||
| `meetings` | `MeetingRepository` | Yes |
|
||||
| `segments` | `SegmentRepository` | Yes |
|
||||
| `summaries` | `SummaryRepository` | Yes |
|
||||
| `annotations` | `AnnotationRepository` | Yes |
|
||||
| `diarization_jobs` | `DiarizationJobRepository` | Yes |
|
||||
| `preferences` | `PreferencesRepository` | Yes |
|
||||
| `entities` | `EntityRepository` | Yes |
|
||||
| `integrations` | `IntegrationRepository` | DB only |
|
||||
| `webhooks` | `WebhookRepository` | Yes |
|
||||
|
||||
Check capability with `supports_*` properties (e.g., `uow.supports_webhooks`).
|
||||
|
||||
## Known Issues
|
||||
|
||||
See `docs/triage.md` for tracked technical debt.
|
||||
See `docs/sprints/` for feature implementation plans and status.
|
||||
|
||||
**Resolved:**
|
||||
- ~~Server-side state volatility~~ → Diarization jobs persisted to DB
|
||||
@@ -332,6 +460,8 @@ See `docs/triage.md` for tracked technical debt.
|
||||
- ~~Synchronous blocking in async gRPC~~ → `run_in_executor` for diarization
|
||||
- ~~Summarization consent not persisted~~ → Stored in `user_preferences` table
|
||||
- ~~VU meter update throttling~~ → 20fps throttle implemented
|
||||
- ~~Webhook infrastructure missing~~ → Full webhook subsystem with executor, service, and repository
|
||||
- ~~Integration/OAuth token storage~~ → `IntegrationSecretModel` for secure token storage
|
||||
|
||||
## MCP Tools Reference
|
||||
|
||||
|
||||
@@ -10,6 +10,7 @@ import logging
|
||||
from typing import TYPE_CHECKING
|
||||
from uuid import UUID
|
||||
|
||||
from noteflow.config.constants import ERR_TOKEN_REFRESH_PREFIX
|
||||
from noteflow.domain.entities.integration import Integration, IntegrationStatus, IntegrationType
|
||||
from noteflow.domain.ports.calendar import CalendarEventInfo, OAuthConnectionInfo
|
||||
from noteflow.domain.value_objects import OAuthProvider, OAuthTokens
|
||||
@@ -136,7 +137,7 @@ class CalendarService:
|
||||
|
||||
# Get user email from provider
|
||||
try:
|
||||
email = await self._get_user_email(oauth_provider, tokens.access_token)
|
||||
email = await self._fetch_account_email(oauth_provider, tokens.access_token)
|
||||
except (GoogleCalendarError, OutlookCalendarError) as e:
|
||||
raise CalendarServiceError(f"Failed to get user email: {e}") from e
|
||||
|
||||
@@ -189,7 +190,7 @@ class CalendarService:
|
||||
if integration is None:
|
||||
return OAuthConnectionInfo(
|
||||
provider=provider,
|
||||
status="disconnected",
|
||||
status=IntegrationStatus.DISCONNECTED.value,
|
||||
)
|
||||
|
||||
# Check token expiry
|
||||
@@ -204,7 +205,7 @@ class CalendarService:
|
||||
if tokens.is_expired():
|
||||
status = "expired"
|
||||
except (KeyError, ValueError):
|
||||
status = "error"
|
||||
status = IntegrationStatus.ERROR.value
|
||||
|
||||
return OAuthConnectionInfo(
|
||||
provider=provider,
|
||||
@@ -289,7 +290,7 @@ class CalendarService:
|
||||
events.extend(provider_events)
|
||||
else:
|
||||
# Fetch from all connected providers
|
||||
for p in ["google", "outlook"]:
|
||||
for p in [OAuthProvider.GOOGLE.value, OAuthProvider.OUTLOOK.value]:
|
||||
try:
|
||||
provider_events = await self._fetch_provider_events(
|
||||
provider=p,
|
||||
@@ -344,10 +345,10 @@ class CalendarService:
|
||||
)
|
||||
await uow.commit()
|
||||
except OAuthError as e:
|
||||
integration.mark_error(f"Token refresh failed: {e}")
|
||||
integration.mark_error(f"{ERR_TOKEN_REFRESH_PREFIX}{e}")
|
||||
await uow.integrations.update(integration)
|
||||
await uow.commit()
|
||||
raise CalendarServiceError(f"Token refresh failed: {e}") from e
|
||||
raise CalendarServiceError(f"{ERR_TOKEN_REFRESH_PREFIX}{e}") from e
|
||||
|
||||
# Fetch events
|
||||
try:
|
||||
@@ -382,12 +383,12 @@ class CalendarService:
|
||||
limit=limit,
|
||||
)
|
||||
|
||||
async def _get_user_email(
|
||||
async def _fetch_account_email(
|
||||
self,
|
||||
provider: OAuthProvider,
|
||||
access_token: str,
|
||||
) -> str:
|
||||
"""Get user email from provider API."""
|
||||
"""Fetch user email from provider API."""
|
||||
adapter = self._get_adapter(provider)
|
||||
return await adapter.get_user_email(access_token)
|
||||
|
||||
@@ -413,9 +414,4 @@ class CalendarService:
|
||||
@staticmethod
|
||||
def _map_integration_status(status: IntegrationStatus) -> str:
|
||||
"""Map IntegrationStatus to connection status string."""
|
||||
mapping = {
|
||||
IntegrationStatus.CONNECTED: "connected",
|
||||
IntegrationStatus.DISCONNECTED: "disconnected",
|
||||
IntegrationStatus.ERROR: "error",
|
||||
}
|
||||
return mapping.get(status, "disconnected")
|
||||
return status.value if status in IntegrationStatus else IntegrationStatus.DISCONNECTED.value
|
||||
|
||||
@@ -142,12 +142,14 @@ class ExportService:
|
||||
Raises:
|
||||
ValueError: If extension is not recognized.
|
||||
"""
|
||||
from noteflow.config.constants import EXPORT_EXT_HTML, EXPORT_EXT_PDF
|
||||
|
||||
extension_map = {
|
||||
".md": ExportFormat.MARKDOWN,
|
||||
".markdown": ExportFormat.MARKDOWN,
|
||||
".html": ExportFormat.HTML,
|
||||
EXPORT_EXT_HTML: ExportFormat.HTML,
|
||||
".htm": ExportFormat.HTML,
|
||||
".pdf": ExportFormat.PDF,
|
||||
EXPORT_EXT_PDF: ExportFormat.PDF,
|
||||
}
|
||||
fmt = extension_map.get(extension.lower())
|
||||
if fmt is None:
|
||||
|
||||
242
src/noteflow/application/services/webhook_service.py
Normal file
242
src/noteflow/application/services/webhook_service.py
Normal file
@@ -0,0 +1,242 @@
|
||||
"""Webhook application service for event notifications."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import logging
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from noteflow.config.constants import DEFAULT_MEETING_TITLE
|
||||
from noteflow.domain.utils.time import utc_now
|
||||
from noteflow.domain.webhooks import (
|
||||
MeetingCompletedPayload,
|
||||
RecordingPayload,
|
||||
SummaryGeneratedPayload,
|
||||
WebhookConfig,
|
||||
WebhookDelivery,
|
||||
WebhookEventType,
|
||||
)
|
||||
from noteflow.infrastructure.webhooks import WebhookExecutor
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from noteflow.domain.entities.meeting import Meeting
|
||||
|
||||
_logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class WebhookService:
|
||||
"""Orchestrate webhook delivery for meeting events.
|
||||
|
||||
Manages webhook configurations and coordinates delivery
|
||||
across all registered webhooks for each event type.
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
executor: WebhookExecutor | None = None,
|
||||
) -> None:
|
||||
"""Initialize webhook service.
|
||||
|
||||
Args:
|
||||
executor: Webhook executor instance (created if not provided).
|
||||
"""
|
||||
self._executor = executor or WebhookExecutor()
|
||||
self._configs: list[WebhookConfig] = []
|
||||
|
||||
def register_webhook(self, config: WebhookConfig) -> None:
|
||||
"""Register a webhook configuration.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration to register.
|
||||
"""
|
||||
self._configs.append(config)
|
||||
_logger.info(
|
||||
"Registered webhook: %s for events: %s",
|
||||
config.url,
|
||||
[e.value for e in config.events],
|
||||
)
|
||||
|
||||
def unregister_webhook(self, webhook_id: str) -> bool:
|
||||
"""Unregister a webhook by ID.
|
||||
|
||||
Args:
|
||||
webhook_id: UUID of webhook to remove.
|
||||
|
||||
Returns:
|
||||
True if webhook was found and removed.
|
||||
"""
|
||||
initial_count = len(self._configs)
|
||||
self._configs = [c for c in self._configs if str(c.id) != webhook_id]
|
||||
removed = len(self._configs) < initial_count
|
||||
if removed:
|
||||
_logger.info("Unregistered webhook: %s", webhook_id)
|
||||
return removed
|
||||
|
||||
def get_webhooks(self) -> list[WebhookConfig]:
|
||||
"""Get all registered webhooks.
|
||||
|
||||
Returns:
|
||||
List of registered webhook configurations.
|
||||
"""
|
||||
return list(self._configs)
|
||||
|
||||
async def trigger_meeting_completed(
|
||||
self,
|
||||
meeting: Meeting,
|
||||
) -> list[WebhookDelivery]:
|
||||
"""Trigger webhooks for meeting completion.
|
||||
|
||||
Args:
|
||||
meeting: Completed meeting entity.
|
||||
|
||||
Returns:
|
||||
List of delivery records for all webhook attempts.
|
||||
"""
|
||||
payload = MeetingCompletedPayload(
|
||||
event=WebhookEventType.MEETING_COMPLETED.value,
|
||||
timestamp=utc_now().isoformat(),
|
||||
meeting_id=str(meeting.id),
|
||||
title=meeting.title or DEFAULT_MEETING_TITLE,
|
||||
duration_seconds=meeting.duration_seconds or 0.0,
|
||||
segment_count=len(meeting.segments),
|
||||
has_summary=meeting.summary is not None,
|
||||
)
|
||||
|
||||
return await self._deliver_to_all(
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload.to_dict(),
|
||||
)
|
||||
|
||||
async def trigger_summary_generated(
|
||||
self,
|
||||
meeting: Meeting,
|
||||
) -> list[WebhookDelivery]:
|
||||
"""Trigger webhooks for summary generation.
|
||||
|
||||
Args:
|
||||
meeting: Meeting with generated summary.
|
||||
|
||||
Returns:
|
||||
List of delivery records for all webhook attempts.
|
||||
"""
|
||||
summary = meeting.summary
|
||||
payload = SummaryGeneratedPayload(
|
||||
event=WebhookEventType.SUMMARY_GENERATED.value,
|
||||
timestamp=utc_now().isoformat(),
|
||||
meeting_id=str(meeting.id),
|
||||
title=meeting.title or DEFAULT_MEETING_TITLE,
|
||||
executive_summary=summary.executive_summary if summary else "",
|
||||
key_points_count=len(summary.key_points) if summary else 0,
|
||||
action_items_count=len(summary.action_items) if summary else 0,
|
||||
)
|
||||
|
||||
return await self._deliver_to_all(
|
||||
WebhookEventType.SUMMARY_GENERATED,
|
||||
payload.to_dict(),
|
||||
)
|
||||
|
||||
async def trigger_recording_started(
|
||||
self,
|
||||
meeting_id: str,
|
||||
title: str,
|
||||
) -> list[WebhookDelivery]:
|
||||
"""Trigger webhooks for recording start.
|
||||
|
||||
Args:
|
||||
meeting_id: ID of meeting being recorded.
|
||||
title: Meeting title.
|
||||
|
||||
Returns:
|
||||
List of delivery records.
|
||||
"""
|
||||
payload = RecordingPayload(
|
||||
event=WebhookEventType.RECORDING_STARTED.value,
|
||||
timestamp=utc_now().isoformat(),
|
||||
meeting_id=meeting_id,
|
||||
title=title,
|
||||
)
|
||||
|
||||
return await self._deliver_to_all(
|
||||
WebhookEventType.RECORDING_STARTED,
|
||||
payload.to_dict(),
|
||||
)
|
||||
|
||||
async def trigger_recording_stopped(
|
||||
self,
|
||||
meeting_id: str,
|
||||
title: str,
|
||||
duration_seconds: float,
|
||||
) -> list[WebhookDelivery]:
|
||||
"""Trigger webhooks for recording stop.
|
||||
|
||||
Args:
|
||||
meeting_id: ID of meeting.
|
||||
title: Meeting title.
|
||||
duration_seconds: Recording duration.
|
||||
|
||||
Returns:
|
||||
List of delivery records.
|
||||
"""
|
||||
payload = RecordingPayload(
|
||||
event=WebhookEventType.RECORDING_STOPPED.value,
|
||||
timestamp=utc_now().isoformat(),
|
||||
meeting_id=meeting_id,
|
||||
title=title,
|
||||
duration_seconds=duration_seconds,
|
||||
)
|
||||
|
||||
return await self._deliver_to_all(
|
||||
WebhookEventType.RECORDING_STOPPED,
|
||||
payload.to_dict(),
|
||||
)
|
||||
|
||||
async def _deliver_to_all(
|
||||
self,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, object],
|
||||
) -> list[WebhookDelivery]:
|
||||
"""Deliver event to all registered webhooks.
|
||||
|
||||
Args:
|
||||
event_type: Type of event.
|
||||
payload: Event payload.
|
||||
|
||||
Returns:
|
||||
List of delivery records.
|
||||
"""
|
||||
deliveries: list[WebhookDelivery] = []
|
||||
|
||||
for config in self._configs:
|
||||
try:
|
||||
delivery = await self._executor.deliver(config, event_type, payload)
|
||||
deliveries.append(delivery)
|
||||
|
||||
if delivery.succeeded:
|
||||
_logger.info(
|
||||
"Webhook delivered: %s -> %s (status=%d)",
|
||||
event_type.value,
|
||||
config.url,
|
||||
delivery.status_code,
|
||||
)
|
||||
elif delivery.attempt_count > 0:
|
||||
_logger.warning(
|
||||
"Webhook failed: %s -> %s (error=%s)",
|
||||
event_type.value,
|
||||
config.url,
|
||||
delivery.error_message,
|
||||
)
|
||||
else:
|
||||
_logger.debug(
|
||||
"Webhook skipped: %s -> %s (reason=%s)",
|
||||
event_type.value,
|
||||
config.url,
|
||||
delivery.error_message,
|
||||
)
|
||||
|
||||
except Exception:
|
||||
_logger.exception("Unexpected error delivering webhook to %s", config.url)
|
||||
|
||||
return deliveries
|
||||
|
||||
async def close(self) -> None:
|
||||
"""Clean up resources."""
|
||||
await self._executor.close()
|
||||
@@ -12,6 +12,8 @@ import subprocess
|
||||
import sys
|
||||
from dataclasses import dataclass, field
|
||||
|
||||
from noteflow.config.constants import SPACY_MODEL_LG, SPACY_MODEL_SM
|
||||
|
||||
logging.basicConfig(
|
||||
level=logging.INFO,
|
||||
format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
|
||||
@@ -21,6 +23,7 @@ logger = logging.getLogger(__name__)
|
||||
# Constants to avoid magic strings
|
||||
_DEFAULT_MODEL = "spacy-en"
|
||||
_LOG_DOWNLOAD_FAILED = "Failed to download %s: %s"
|
||||
_CMD_DOWNLOAD = "download"
|
||||
|
||||
|
||||
# Registry of available models with their download commands
|
||||
@@ -38,17 +41,17 @@ class ModelInfo:
|
||||
AVAILABLE_MODELS: dict[str, ModelInfo] = {
|
||||
_DEFAULT_MODEL: ModelInfo(
|
||||
name=_DEFAULT_MODEL,
|
||||
description="English NER model (en_core_web_sm)",
|
||||
description=f"English NER model ({SPACY_MODEL_SM})",
|
||||
feature="ner",
|
||||
install_command=["python", "-m", "spacy", "download", "en_core_web_sm"],
|
||||
check_import="en_core_web_sm",
|
||||
install_command=["python", "-m", "spacy", _CMD_DOWNLOAD, SPACY_MODEL_SM],
|
||||
check_import=SPACY_MODEL_SM,
|
||||
),
|
||||
"spacy-en-lg": ModelInfo(
|
||||
name="spacy-en-lg",
|
||||
description="English NER model - large (en_core_web_lg)",
|
||||
description=f"English NER model - large ({SPACY_MODEL_LG})",
|
||||
feature="ner",
|
||||
install_command=["python", "-m", "spacy", "download", "en_core_web_lg"],
|
||||
check_import="en_core_web_lg",
|
||||
install_command=["python", "-m", "spacy", _CMD_DOWNLOAD, SPACY_MODEL_LG],
|
||||
check_import=SPACY_MODEL_LG,
|
||||
),
|
||||
}
|
||||
|
||||
@@ -256,7 +259,7 @@ def main() -> None:
|
||||
subparsers = parser.add_subparsers(dest="command", help="Available commands")
|
||||
|
||||
# download command
|
||||
download_parser = subparsers.add_parser("download", help="Download ML models")
|
||||
download_parser = subparsers.add_parser(_CMD_DOWNLOAD, help="Download ML models")
|
||||
download_parser.add_argument(
|
||||
"--model",
|
||||
choices=list(AVAILABLE_MODELS.keys()),
|
||||
@@ -275,7 +278,7 @@ def main() -> None:
|
||||
parser.print_help()
|
||||
sys.exit(1)
|
||||
|
||||
if args.command == "download":
|
||||
if args.command == _CMD_DOWNLOAD:
|
||||
exit_code = _run_download(model_name=args.model)
|
||||
elif args.command == "list":
|
||||
exit_code = _list_models()
|
||||
|
||||
@@ -12,7 +12,7 @@ import sys
|
||||
|
||||
from noteflow.application.services import RetentionService
|
||||
from noteflow.config.settings import get_settings
|
||||
from noteflow.infrastructure.persistence.unit_of_work import SqlAlchemyUnitOfWork
|
||||
from noteflow.infrastructure.persistence.unit_of_work import create_uow_factory
|
||||
|
||||
logging.basicConfig(
|
||||
level=logging.INFO,
|
||||
@@ -38,7 +38,7 @@ async def _run_cleanup(dry_run: bool) -> int:
|
||||
)
|
||||
return 1
|
||||
|
||||
uow_factory = SqlAlchemyUnitOfWork.factory_from_settings(settings)
|
||||
uow_factory = create_uow_factory(settings)
|
||||
service = RetentionService(
|
||||
uow_factory=uow_factory,
|
||||
retention_days=settings.retention_days,
|
||||
@@ -76,7 +76,7 @@ async def _show_status() -> int:
|
||||
"""
|
||||
settings = get_settings()
|
||||
|
||||
uow_factory = SqlAlchemyUnitOfWork.factory_from_settings(settings)
|
||||
uow_factory = create_uow_factory(settings)
|
||||
service = RetentionService(
|
||||
uow_factory=uow_factory,
|
||||
retention_days=settings.retention_days,
|
||||
|
||||
@@ -26,3 +26,57 @@ AUDIO_BUFFER_SIZE_BYTES: Final[int] = 320_000
|
||||
|
||||
PERIODIC_FLUSH_INTERVAL_SECONDS: Final[float] = 2.0
|
||||
"""Interval for periodic audio buffer flush to disk (crash resilience)."""
|
||||
|
||||
# Meeting defaults
|
||||
DEFAULT_MEETING_TITLE: Final[str] = "Untitled"
|
||||
"""Default title for meetings without an explicit title."""
|
||||
|
||||
# Diarization constants
|
||||
ERR_HF_TOKEN_REQUIRED: Final[str] = "HuggingFace token required for pyannote models"
|
||||
"""Error message when HuggingFace token is missing for pyannote."""
|
||||
|
||||
ERR_SERVER_RESTARTED: Final[str] = "Server restarted"
|
||||
"""Error message for jobs interrupted by server restart."""
|
||||
|
||||
# Calendar/OAuth error messages
|
||||
ERR_TOKEN_EXPIRED: Final[str] = "Access token expired or invalid"
|
||||
"""Error for expired or invalid OAuth tokens."""
|
||||
|
||||
ERR_API_PREFIX: Final[str] = "API error: "
|
||||
"""Prefix for API error messages."""
|
||||
|
||||
ERR_TOKEN_REFRESH_PREFIX: Final[str] = "Token refresh failed: "
|
||||
"""Prefix for token refresh error messages."""
|
||||
|
||||
HTTP_AUTHORIZATION: Final[str] = "Authorization"
|
||||
"""HTTP Authorization header name."""
|
||||
|
||||
HTTP_BEARER_PREFIX: Final[str] = "Bearer "
|
||||
"""Standard HTTP authorization header prefix."""
|
||||
|
||||
# Application directory
|
||||
APP_DIR_NAME: Final[str] = ".noteflow"
|
||||
"""Application data directory name within user home."""
|
||||
|
||||
# spaCy NER model names
|
||||
SPACY_MODEL_SM: Final[str] = "en_core_web_sm"
|
||||
"""Small English spaCy model for NER."""
|
||||
|
||||
SPACY_MODEL_MD: Final[str] = "en_core_web_md"
|
||||
"""Medium English spaCy model for NER."""
|
||||
|
||||
SPACY_MODEL_LG: Final[str] = "en_core_web_lg"
|
||||
"""Large English spaCy model for NER."""
|
||||
|
||||
SPACY_MODEL_TRF: Final[str] = "en_core_web_trf"
|
||||
"""Transformer-based English spaCy model for NER."""
|
||||
|
||||
# Export format constants
|
||||
EXPORT_FORMAT_HTML: Final[str] = "HTML"
|
||||
"""HTML export format display name."""
|
||||
|
||||
EXPORT_EXT_HTML: Final[str] = ".html"
|
||||
"""HTML file extension."""
|
||||
|
||||
EXPORT_EXT_PDF: Final[str] = ".pdf"
|
||||
"""PDF file extension."""
|
||||
|
||||
@@ -8,10 +8,16 @@ from typing import Annotated
|
||||
from pydantic import Field, PostgresDsn, field_validator
|
||||
from pydantic_settings import BaseSettings, SettingsConfigDict
|
||||
|
||||
from noteflow.config.constants import APP_DIR_NAME
|
||||
|
||||
# Shared settings configuration values
|
||||
_ENV_FILE = ".env"
|
||||
_EXTRA_IGNORE = "ignore"
|
||||
|
||||
|
||||
def _default_meetings_dir() -> Path:
|
||||
"""Return default meetings directory path."""
|
||||
return Path.home() / ".noteflow" / "meetings"
|
||||
return Path.home() / APP_DIR_NAME / "meetings"
|
||||
|
||||
|
||||
class TriggerSettings(BaseSettings):
|
||||
@@ -19,10 +25,10 @@ class TriggerSettings(BaseSettings):
|
||||
|
||||
model_config = SettingsConfigDict(
|
||||
env_prefix="NOTEFLOW_",
|
||||
env_file=".env",
|
||||
env_file=_ENV_FILE,
|
||||
env_file_encoding="utf-8",
|
||||
enable_decoding=False,
|
||||
extra="ignore",
|
||||
extra=_EXTRA_IGNORE,
|
||||
)
|
||||
|
||||
# Trigger settings (client-side)
|
||||
@@ -207,9 +213,9 @@ class FeatureFlags(BaseSettings):
|
||||
|
||||
model_config = SettingsConfigDict(
|
||||
env_prefix="NOTEFLOW_FEATURE_",
|
||||
env_file=".env",
|
||||
env_file=_ENV_FILE,
|
||||
env_file_encoding="utf-8",
|
||||
extra="ignore",
|
||||
extra=_EXTRA_IGNORE,
|
||||
)
|
||||
|
||||
templates_enabled: Annotated[
|
||||
@@ -250,9 +256,9 @@ class CalendarSettings(BaseSettings):
|
||||
|
||||
model_config = SettingsConfigDict(
|
||||
env_prefix="NOTEFLOW_CALENDAR_",
|
||||
env_file=".env",
|
||||
env_file=_ENV_FILE,
|
||||
env_file_encoding="utf-8",
|
||||
extra="ignore",
|
||||
extra=_EXTRA_IGNORE,
|
||||
)
|
||||
|
||||
# Google OAuth
|
||||
@@ -403,6 +409,78 @@ class Settings(TriggerSettings):
|
||||
bool,
|
||||
Field(default=True, description="Enable post-meeting diarization refinement"),
|
||||
]
|
||||
diarization_job_ttl_hours: Annotated[
|
||||
int,
|
||||
Field(default=1, ge=1, le=168, description="Hours to retain diarization job records"),
|
||||
]
|
||||
|
||||
# gRPC streaming settings
|
||||
grpc_max_chunk_size_mb: Annotated[
|
||||
int,
|
||||
Field(default=1, ge=1, le=100, description="Maximum gRPC chunk size in MB"),
|
||||
]
|
||||
grpc_chunk_timeout_seconds: Annotated[
|
||||
float,
|
||||
Field(default=0.1, ge=0.01, le=10.0, description="Timeout for receiving audio chunks"),
|
||||
]
|
||||
grpc_queue_max_size: Annotated[
|
||||
int,
|
||||
Field(default=1000, ge=100, le=10000, description="Maximum audio queue size"),
|
||||
]
|
||||
grpc_partial_cadence_seconds: Annotated[
|
||||
float,
|
||||
Field(default=2.0, ge=0.5, le=10.0, description="Interval for emitting partial transcripts"),
|
||||
]
|
||||
grpc_min_partial_audio_seconds: Annotated[
|
||||
float,
|
||||
Field(default=0.5, ge=0.1, le=5.0, description="Minimum audio for partial inference"),
|
||||
]
|
||||
|
||||
# Webhook settings
|
||||
webhook_timeout_seconds: Annotated[
|
||||
float,
|
||||
Field(default=10.0, ge=1.0, le=60.0, description="Webhook HTTP request timeout"),
|
||||
]
|
||||
webhook_max_retries: Annotated[
|
||||
int,
|
||||
Field(default=3, ge=0, le=10, description="Maximum webhook delivery attempts"),
|
||||
]
|
||||
webhook_backoff_base: Annotated[
|
||||
float,
|
||||
Field(default=2.0, ge=1.1, le=5.0, description="Exponential backoff multiplier for webhook retries"),
|
||||
]
|
||||
webhook_max_response_length: Annotated[
|
||||
int,
|
||||
Field(default=500, ge=100, le=10000, description="Maximum response body length to log"),
|
||||
]
|
||||
|
||||
# LLM/Summarization settings
|
||||
llm_temperature: Annotated[
|
||||
float,
|
||||
Field(default=0.3, ge=0.0, le=2.0, description="Temperature for LLM inference"),
|
||||
]
|
||||
llm_default_openai_model: Annotated[
|
||||
str,
|
||||
Field(default="gpt-4o-mini", description="Default OpenAI model for summarization"),
|
||||
]
|
||||
llm_default_anthropic_model: Annotated[
|
||||
str,
|
||||
Field(default="claude-3-haiku-20240307", description="Default Anthropic model for summarization"),
|
||||
]
|
||||
llm_timeout_seconds: Annotated[
|
||||
float,
|
||||
Field(default=60.0, ge=10.0, le=300.0, description="Timeout for LLM requests"),
|
||||
]
|
||||
|
||||
# Ollama settings
|
||||
ollama_host: Annotated[
|
||||
str,
|
||||
Field(default="http://localhost:11434", description="Ollama server host URL"),
|
||||
]
|
||||
ollama_timeout_seconds: Annotated[
|
||||
float,
|
||||
Field(default=120.0, ge=10.0, le=600.0, description="Timeout for Ollama requests"),
|
||||
]
|
||||
|
||||
@property
|
||||
def database_url_str(self) -> str:
|
||||
|
||||
@@ -26,7 +26,7 @@ class KeyPoint:
|
||||
# Database primary key (set after persistence)
|
||||
db_id: int | None = None
|
||||
|
||||
def has_evidence(self) -> bool:
|
||||
def is_sourced(self) -> bool:
|
||||
"""Check if key point is backed by transcript evidence."""
|
||||
return len(self.segment_ids) > 0
|
||||
|
||||
@@ -55,8 +55,9 @@ class ActionItem:
|
||||
return len(self.segment_ids) > 0
|
||||
|
||||
def is_assigned(self) -> bool:
|
||||
"""Check if action item has an assignee."""
|
||||
return bool(self.assignee)
|
||||
"""Check if action item has a non-empty assignee."""
|
||||
assignee = self.assignee
|
||||
return bool(assignee and assignee.strip())
|
||||
|
||||
def has_due_date(self) -> bool:
|
||||
"""Check if action item has a due date."""
|
||||
@@ -91,11 +92,17 @@ class Summary:
|
||||
|
||||
def all_points_have_evidence(self) -> bool:
|
||||
"""Check if all key points have transcript evidence."""
|
||||
return all(kp.has_evidence() for kp in self.key_points)
|
||||
points = self.key_points
|
||||
if not points:
|
||||
return True
|
||||
return all(kp.is_sourced() for kp in points)
|
||||
|
||||
def all_actions_have_evidence(self) -> bool:
|
||||
"""Check if all action items have transcript evidence."""
|
||||
return all(ai.has_evidence() for ai in self.action_items)
|
||||
actions = self.action_items
|
||||
if not actions:
|
||||
return True
|
||||
return all(ai.has_evidence() for ai in actions)
|
||||
|
||||
def is_fully_evidenced(self) -> bool:
|
||||
"""Check if entire summary is backed by transcript evidence."""
|
||||
@@ -103,18 +110,20 @@ class Summary:
|
||||
|
||||
@property
|
||||
def key_point_count(self) -> int:
|
||||
"""Number of key points."""
|
||||
return len(self.key_points)
|
||||
"""Number of key points in summary."""
|
||||
points = self.key_points
|
||||
return len(points)
|
||||
|
||||
@property
|
||||
def action_item_count(self) -> int:
|
||||
"""Number of action items."""
|
||||
return len(self.action_items)
|
||||
"""Number of action items in summary."""
|
||||
actions = self.action_items
|
||||
return len(actions)
|
||||
|
||||
@property
|
||||
def unevidenced_points(self) -> list[KeyPoint]:
|
||||
"""Key points without transcript evidence."""
|
||||
return [kp for kp in self.key_points if not kp.has_evidence()]
|
||||
return [kp for kp in self.key_points if not kp.is_sourced()]
|
||||
|
||||
@property
|
||||
def unevidenced_actions(self) -> list[ActionItem]:
|
||||
|
||||
@@ -19,6 +19,7 @@ from .repositories import (
|
||||
MeetingRepository,
|
||||
SegmentRepository,
|
||||
SummaryRepository,
|
||||
WebhookRepository,
|
||||
)
|
||||
from .unit_of_work import UnitOfWork
|
||||
|
||||
@@ -38,4 +39,5 @@ __all__ = [
|
||||
"SegmentRepository",
|
||||
"SummaryRepository",
|
||||
"UnitOfWork",
|
||||
"WebhookRepository",
|
||||
]
|
||||
|
||||
@@ -6,12 +6,15 @@ from collections.abc import Sequence
|
||||
from datetime import datetime
|
||||
from typing import TYPE_CHECKING, Protocol
|
||||
|
||||
from noteflow.config.constants import ERR_SERVER_RESTARTED
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from uuid import UUID
|
||||
|
||||
from noteflow.domain.entities import Annotation, Integration, Meeting, Segment, Summary
|
||||
from noteflow.domain.entities.named_entity import NamedEntity
|
||||
from noteflow.domain.value_objects import AnnotationId, MeetingId, MeetingState
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookDelivery
|
||||
from noteflow.infrastructure.persistence.repositories import DiarizationJob, StreamingTurn
|
||||
|
||||
|
||||
@@ -223,25 +226,11 @@ class SummaryRepository(Protocol):
|
||||
...
|
||||
|
||||
async def get_by_meeting(self, meeting_id: MeetingId) -> Summary | None:
|
||||
"""Get summary for a meeting.
|
||||
|
||||
Args:
|
||||
meeting_id: Meeting identifier.
|
||||
|
||||
Returns:
|
||||
Summary if exists, None otherwise.
|
||||
"""
|
||||
"""Get summary for a meeting, or None if not found."""
|
||||
...
|
||||
|
||||
async def delete_by_meeting(self, meeting_id: MeetingId) -> bool:
|
||||
"""Delete summary for a meeting.
|
||||
|
||||
Args:
|
||||
meeting_id: Meeting identifier.
|
||||
|
||||
Returns:
|
||||
True if deleted, False if not found.
|
||||
"""
|
||||
"""Delete summary for a meeting. Returns True if deleted."""
|
||||
...
|
||||
|
||||
|
||||
@@ -263,28 +252,11 @@ class AnnotationRepository(Protocol):
|
||||
...
|
||||
|
||||
async def get(self, annotation_id: AnnotationId) -> Annotation | None:
|
||||
"""Retrieve an annotation by ID.
|
||||
|
||||
Args:
|
||||
annotation_id: Annotation identifier.
|
||||
|
||||
Returns:
|
||||
Annotation if found, None otherwise.
|
||||
"""
|
||||
"""Retrieve an annotation by ID, or None if not found."""
|
||||
...
|
||||
|
||||
async def get_by_meeting(
|
||||
self,
|
||||
meeting_id: MeetingId,
|
||||
) -> Sequence[Annotation]:
|
||||
"""Get all annotations for a meeting.
|
||||
|
||||
Args:
|
||||
meeting_id: Meeting identifier.
|
||||
|
||||
Returns:
|
||||
List of annotations ordered by start_time.
|
||||
"""
|
||||
async def get_by_meeting(self, meeting_id: MeetingId) -> Sequence[Annotation]:
|
||||
"""Get all annotations for a meeting, ordered by start_time."""
|
||||
...
|
||||
|
||||
async def get_by_time_range(
|
||||
@@ -434,7 +406,7 @@ class DiarizationJobRepository(Protocol):
|
||||
"""
|
||||
...
|
||||
|
||||
async def mark_running_as_failed(self, error_message: str = "Server restarted") -> int:
|
||||
async def mark_running_as_failed(self, error_message: str = ERR_SERVER_RESTARTED) -> int:
|
||||
"""Mark queued/running jobs as failed.
|
||||
|
||||
Args:
|
||||
@@ -676,3 +648,45 @@ class IntegrationRepository(Protocol):
|
||||
List of integrations of the specified type.
|
||||
"""
|
||||
...
|
||||
|
||||
|
||||
class WebhookRepository(Protocol):
|
||||
"""Repository for webhook configuration and delivery operations."""
|
||||
|
||||
async def get_all_enabled(
|
||||
self, workspace_id: UUID | None = None,
|
||||
) -> Sequence[WebhookConfig]:
|
||||
"""Return all enabled webhooks, optionally filtered by workspace."""
|
||||
...
|
||||
|
||||
async def get_all(
|
||||
self, workspace_id: UUID | None = None,
|
||||
) -> Sequence[WebhookConfig]:
|
||||
"""Return all webhooks regardless of enabled status."""
|
||||
...
|
||||
|
||||
async def get_by_id(self, webhook_id: UUID) -> WebhookConfig | None:
|
||||
"""Return webhook by ID or None if not found."""
|
||||
...
|
||||
|
||||
async def create(self, config: WebhookConfig) -> WebhookConfig:
|
||||
"""Persist a new webhook configuration."""
|
||||
...
|
||||
|
||||
async def update(self, config: WebhookConfig) -> WebhookConfig:
|
||||
"""Update existing webhook. Raises ValueError if not found."""
|
||||
...
|
||||
|
||||
async def delete(self, webhook_id: UUID) -> bool:
|
||||
"""Delete webhook by ID. Return True if deleted, False if not found."""
|
||||
...
|
||||
|
||||
async def add_delivery(self, delivery: WebhookDelivery) -> WebhookDelivery:
|
||||
"""Record a webhook delivery attempt."""
|
||||
...
|
||||
|
||||
async def get_deliveries(
|
||||
self, webhook_id: UUID, limit: int = 50,
|
||||
) -> Sequence[WebhookDelivery]:
|
||||
"""Return delivery history for webhook, newest first."""
|
||||
...
|
||||
|
||||
@@ -14,6 +14,7 @@ if TYPE_CHECKING:
|
||||
PreferencesRepository,
|
||||
SegmentRepository,
|
||||
SummaryRepository,
|
||||
WebhookRepository,
|
||||
)
|
||||
|
||||
|
||||
@@ -77,6 +78,11 @@ class UnitOfWork(Protocol):
|
||||
"""Access the integrations repository for OAuth connections."""
|
||||
...
|
||||
|
||||
@property
|
||||
def webhooks(self) -> WebhookRepository:
|
||||
"""Access the webhooks repository for event notifications."""
|
||||
...
|
||||
|
||||
# Feature flags for DB-only capabilities
|
||||
@property
|
||||
def supports_annotations(self) -> bool:
|
||||
@@ -119,6 +125,14 @@ class UnitOfWork(Protocol):
|
||||
"""
|
||||
...
|
||||
|
||||
@property
|
||||
def supports_webhooks(self) -> bool:
|
||||
"""Check if webhook persistence is supported.
|
||||
|
||||
Returns False for memory-only implementations.
|
||||
"""
|
||||
...
|
||||
|
||||
async def __aenter__(self) -> Self:
|
||||
"""Enter the unit of work context.
|
||||
|
||||
|
||||
@@ -28,7 +28,8 @@ class SummarizationRequest:
|
||||
@property
|
||||
def transcript_text(self) -> str:
|
||||
"""Concatenate all segment text into a single transcript."""
|
||||
return " ".join(seg.text for seg in self.segments)
|
||||
segments = self.segments
|
||||
return " ".join(seg.text for seg in segments)
|
||||
|
||||
@property
|
||||
def segment_count(self) -> int:
|
||||
|
||||
@@ -18,4 +18,5 @@ def utc_now() -> datetime:
|
||||
Returns:
|
||||
Current datetime in UTC timezone with microsecond precision.
|
||||
"""
|
||||
return datetime.now(UTC)
|
||||
now = datetime.now(UTC)
|
||||
return now
|
||||
|
||||
@@ -108,8 +108,8 @@ class OAuthState:
|
||||
created_at: datetime
|
||||
expires_at: datetime
|
||||
|
||||
def is_expired(self) -> bool:
|
||||
"""Check if the state has expired."""
|
||||
def is_state_expired(self) -> bool:
|
||||
"""Check if the OAuth state has expired."""
|
||||
return datetime.now(self.created_at.tzinfo) > self.expires_at
|
||||
|
||||
|
||||
|
||||
21
src/noteflow/domain/webhooks/__init__.py
Normal file
21
src/noteflow/domain/webhooks/__init__.py
Normal file
@@ -0,0 +1,21 @@
|
||||
"""Webhook domain module for event notification system."""
|
||||
|
||||
from .events import (
|
||||
MeetingCompletedPayload,
|
||||
RecordingPayload,
|
||||
SummaryGeneratedPayload,
|
||||
WebhookConfig,
|
||||
WebhookDelivery,
|
||||
WebhookEventType,
|
||||
WebhookPayload,
|
||||
)
|
||||
|
||||
__all__ = [
|
||||
"MeetingCompletedPayload",
|
||||
"RecordingPayload",
|
||||
"SummaryGeneratedPayload",
|
||||
"WebhookConfig",
|
||||
"WebhookDelivery",
|
||||
"WebhookEventType",
|
||||
"WebhookPayload",
|
||||
]
|
||||
311
src/noteflow/domain/webhooks/events.py
Normal file
311
src/noteflow/domain/webhooks/events.py
Normal file
@@ -0,0 +1,311 @@
|
||||
"""Webhook event types and domain entities.
|
||||
|
||||
Domain entities match the ORM models in
|
||||
infrastructure/persistence/models/integrations/webhook.py for seamless conversion.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from dataclasses import dataclass, field
|
||||
from datetime import datetime
|
||||
from enum import Enum
|
||||
from typing import Any
|
||||
from uuid import UUID, uuid4
|
||||
|
||||
from noteflow.domain.utils.time import utc_now
|
||||
|
||||
|
||||
class WebhookEventType(Enum):
|
||||
"""Types of webhook trigger events."""
|
||||
|
||||
MEETING_COMPLETED = "meeting.completed"
|
||||
SUMMARY_GENERATED = "summary.generated"
|
||||
RECORDING_STARTED = "recording.started"
|
||||
RECORDING_STOPPED = "recording.stopped"
|
||||
|
||||
|
||||
@dataclass(frozen=True, slots=True)
|
||||
class WebhookConfig:
|
||||
"""Webhook configuration for event delivery.
|
||||
|
||||
Fields match WebhookConfigModel ORM for seamless conversion.
|
||||
|
||||
Attributes:
|
||||
id: Unique webhook identifier.
|
||||
workspace_id: Workspace this webhook belongs to.
|
||||
url: Target URL for webhook delivery.
|
||||
events: Set of event types this webhook is subscribed to.
|
||||
name: Display name for the webhook.
|
||||
secret: Optional HMAC signing secret.
|
||||
enabled: Whether the webhook is active.
|
||||
timeout_ms: HTTP request timeout in milliseconds.
|
||||
max_retries: Maximum delivery retry attempts.
|
||||
created_at: When the webhook was created.
|
||||
updated_at: When the webhook was last modified.
|
||||
"""
|
||||
|
||||
id: UUID
|
||||
workspace_id: UUID
|
||||
url: str
|
||||
events: frozenset[WebhookEventType]
|
||||
name: str = "Webhook"
|
||||
secret: str | None = None
|
||||
enabled: bool = True
|
||||
timeout_ms: int = 10000
|
||||
max_retries: int = 3
|
||||
created_at: datetime = field(default_factory=utc_now)
|
||||
updated_at: datetime = field(default_factory=utc_now)
|
||||
|
||||
@classmethod
|
||||
def create(
|
||||
cls,
|
||||
workspace_id: UUID,
|
||||
url: str,
|
||||
events: list[WebhookEventType],
|
||||
*,
|
||||
name: str = "Webhook",
|
||||
secret: str | None = None,
|
||||
timeout_ms: int = 10000,
|
||||
max_retries: int = 3,
|
||||
) -> WebhookConfig:
|
||||
"""Create a new webhook configuration.
|
||||
|
||||
Args:
|
||||
workspace_id: Workspace UUID.
|
||||
url: Target URL for delivery.
|
||||
events: List of event types to subscribe.
|
||||
name: Display name.
|
||||
secret: Optional HMAC signing secret.
|
||||
timeout_ms: Request timeout in milliseconds.
|
||||
max_retries: Maximum retry attempts.
|
||||
|
||||
Returns:
|
||||
New WebhookConfig with generated ID and timestamps.
|
||||
"""
|
||||
now = utc_now()
|
||||
return cls(
|
||||
id=uuid4(),
|
||||
workspace_id=workspace_id,
|
||||
url=url,
|
||||
events=frozenset(events),
|
||||
name=name,
|
||||
secret=secret,
|
||||
timeout_ms=timeout_ms,
|
||||
max_retries=max_retries,
|
||||
created_at=now,
|
||||
updated_at=now,
|
||||
)
|
||||
|
||||
def subscribes_to(self, event_type: WebhookEventType) -> bool:
|
||||
"""Check if this webhook subscribes to the given event type.
|
||||
|
||||
Args:
|
||||
event_type: Event type to check.
|
||||
|
||||
Returns:
|
||||
True if subscribed to this event.
|
||||
"""
|
||||
return event_type in self.events
|
||||
|
||||
|
||||
@dataclass(frozen=True, slots=True)
|
||||
class WebhookDelivery:
|
||||
"""Record of a webhook delivery attempt.
|
||||
|
||||
Fields match WebhookDeliveryModel ORM for seamless conversion.
|
||||
|
||||
Attributes:
|
||||
id: Unique delivery identifier.
|
||||
webhook_id: Associated webhook config ID.
|
||||
event_type: Type of event that triggered delivery.
|
||||
payload: Event payload that was sent.
|
||||
status_code: HTTP response status code (None if request failed).
|
||||
response_body: Response body (truncated if large).
|
||||
error_message: Error description if delivery failed.
|
||||
attempt_count: Number of delivery attempts made.
|
||||
duration_ms: Request duration in milliseconds.
|
||||
delivered_at: When the delivery was attempted.
|
||||
"""
|
||||
|
||||
id: UUID
|
||||
webhook_id: UUID
|
||||
event_type: WebhookEventType
|
||||
payload: dict[str, Any]
|
||||
status_code: int | None
|
||||
response_body: str | None
|
||||
error_message: str | None
|
||||
attempt_count: int
|
||||
duration_ms: int | None
|
||||
delivered_at: datetime
|
||||
|
||||
@classmethod
|
||||
def create(
|
||||
cls,
|
||||
webhook_id: UUID,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, Any],
|
||||
*,
|
||||
status_code: int | None = None,
|
||||
response_body: str | None = None,
|
||||
error_message: str | None = None,
|
||||
attempt_count: int = 1,
|
||||
duration_ms: int | None = None,
|
||||
) -> WebhookDelivery:
|
||||
"""Create a new delivery record.
|
||||
|
||||
Args:
|
||||
webhook_id: Associated webhook config ID.
|
||||
event_type: Type of event.
|
||||
payload: Event payload.
|
||||
status_code: HTTP response status.
|
||||
response_body: Response body.
|
||||
error_message: Error description.
|
||||
attempt_count: Number of attempts.
|
||||
duration_ms: Request duration.
|
||||
|
||||
Returns:
|
||||
New WebhookDelivery with generated ID and timestamp.
|
||||
"""
|
||||
return cls(
|
||||
id=uuid4(),
|
||||
webhook_id=webhook_id,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=status_code,
|
||||
response_body=response_body,
|
||||
error_message=error_message,
|
||||
attempt_count=attempt_count,
|
||||
duration_ms=duration_ms,
|
||||
delivered_at=utc_now(),
|
||||
)
|
||||
|
||||
@property
|
||||
def succeeded(self) -> bool:
|
||||
"""Check if delivery was successful.
|
||||
|
||||
Returns:
|
||||
True if status code indicates success (2xx).
|
||||
"""
|
||||
return self.status_code is not None and 200 <= self.status_code < 300
|
||||
|
||||
|
||||
@dataclass(frozen=True, slots=True)
|
||||
class WebhookPayload:
|
||||
"""Base webhook event payload.
|
||||
|
||||
Attributes:
|
||||
event: Event type identifier string.
|
||||
timestamp: ISO 8601 formatted event timestamp.
|
||||
meeting_id: Associated meeting UUID as string.
|
||||
"""
|
||||
|
||||
event: str
|
||||
timestamp: str
|
||||
meeting_id: str
|
||||
|
||||
def to_dict(self) -> dict[str, Any]:
|
||||
"""Convert to dictionary for JSON serialization.
|
||||
|
||||
Returns:
|
||||
Dictionary representation of the payload.
|
||||
"""
|
||||
return {
|
||||
"event": self.event,
|
||||
"timestamp": self.timestamp,
|
||||
"meeting_id": self.meeting_id,
|
||||
}
|
||||
|
||||
|
||||
@dataclass(frozen=True, slots=True)
|
||||
class MeetingCompletedPayload(WebhookPayload):
|
||||
"""Payload for meeting.completed event.
|
||||
|
||||
Attributes:
|
||||
title: Meeting title.
|
||||
duration_seconds: Total meeting duration.
|
||||
segment_count: Number of transcript segments.
|
||||
has_summary: Whether a summary exists.
|
||||
"""
|
||||
|
||||
title: str
|
||||
duration_seconds: float
|
||||
segment_count: int
|
||||
has_summary: bool
|
||||
|
||||
def to_dict(self) -> dict[str, Any]:
|
||||
"""Convert to dictionary for JSON serialization.
|
||||
|
||||
Returns:
|
||||
Dictionary representation including meeting details.
|
||||
"""
|
||||
return {
|
||||
"event": self.event,
|
||||
"timestamp": self.timestamp,
|
||||
"meeting_id": self.meeting_id,
|
||||
"title": self.title,
|
||||
"duration_seconds": self.duration_seconds,
|
||||
"segment_count": self.segment_count,
|
||||
"has_summary": self.has_summary,
|
||||
}
|
||||
|
||||
|
||||
@dataclass(frozen=True, slots=True)
|
||||
class SummaryGeneratedPayload(WebhookPayload):
|
||||
"""Payload for summary.generated event.
|
||||
|
||||
Attributes:
|
||||
title: Meeting title.
|
||||
executive_summary: Summary executive overview text.
|
||||
key_points_count: Number of key points in summary.
|
||||
action_items_count: Number of action items in summary.
|
||||
"""
|
||||
|
||||
title: str
|
||||
executive_summary: str
|
||||
key_points_count: int
|
||||
action_items_count: int
|
||||
|
||||
def to_dict(self) -> dict[str, Any]:
|
||||
"""Convert to dictionary for JSON serialization.
|
||||
|
||||
Returns:
|
||||
Dictionary representation including summary details.
|
||||
"""
|
||||
return {
|
||||
"event": self.event,
|
||||
"timestamp": self.timestamp,
|
||||
"meeting_id": self.meeting_id,
|
||||
"title": self.title,
|
||||
"executive_summary": self.executive_summary,
|
||||
"key_points_count": self.key_points_count,
|
||||
"action_items_count": self.action_items_count,
|
||||
}
|
||||
|
||||
|
||||
@dataclass(frozen=True, slots=True)
|
||||
class RecordingPayload(WebhookPayload):
|
||||
"""Payload for recording.started and recording.stopped events.
|
||||
|
||||
Attributes:
|
||||
title: Meeting title.
|
||||
duration_seconds: Recording duration (only for stopped events).
|
||||
"""
|
||||
|
||||
title: str
|
||||
duration_seconds: float | None = None
|
||||
|
||||
def to_dict(self) -> dict[str, Any]:
|
||||
"""Convert to dictionary for JSON serialization.
|
||||
|
||||
Returns:
|
||||
Dictionary representation including recording details.
|
||||
"""
|
||||
result: dict[str, Any] = {
|
||||
"event": self.event,
|
||||
"timestamp": self.timestamp,
|
||||
"meeting_id": self.meeting_id,
|
||||
"title": self.title,
|
||||
}
|
||||
if self.duration_seconds is not None:
|
||||
result["duration_seconds"] = self.duration_seconds
|
||||
return result
|
||||
@@ -9,6 +9,7 @@ from .export import ExportMixin
|
||||
from .meeting import MeetingMixin
|
||||
from .streaming import StreamingMixin
|
||||
from .summarization import SummarizationMixin
|
||||
from .webhooks import WebhooksMixin
|
||||
|
||||
__all__ = [
|
||||
"AnnotationMixin",
|
||||
@@ -20,4 +21,5 @@ __all__ = [
|
||||
"MeetingMixin",
|
||||
"StreamingMixin",
|
||||
"SummarizationMixin",
|
||||
"WebhooksMixin",
|
||||
]
|
||||
|
||||
@@ -22,6 +22,10 @@ from .errors import abort_database_required, abort_invalid_argument, abort_not_f
|
||||
if TYPE_CHECKING:
|
||||
from .protocols import ServicerHost
|
||||
|
||||
# Entity type names for error messages
|
||||
_ENTITY_ANNOTATION = "Annotation"
|
||||
_ENTITY_ANNOTATIONS = "Annotations"
|
||||
|
||||
|
||||
class AnnotationMixin:
|
||||
"""Mixin providing annotation CRUD functionality.
|
||||
@@ -38,7 +42,7 @@ class AnnotationMixin:
|
||||
"""Add an annotation to a meeting."""
|
||||
async with self._create_repository_provider() as repo:
|
||||
if not repo.supports_annotations:
|
||||
await abort_database_required(context, "Annotations")
|
||||
await abort_database_required(context, _ENTITY_ANNOTATIONS)
|
||||
|
||||
meeting_id = await parse_meeting_id_or_abort(request.meeting_id, context)
|
||||
annotation_type = proto_to_annotation_type(request.annotation_type)
|
||||
@@ -64,7 +68,7 @@ class AnnotationMixin:
|
||||
"""Get an annotation by ID."""
|
||||
async with self._create_repository_provider() as repo:
|
||||
if not repo.supports_annotations:
|
||||
await abort_database_required(context, "Annotations")
|
||||
await abort_database_required(context, _ENTITY_ANNOTATIONS)
|
||||
|
||||
try:
|
||||
annotation_id = parse_annotation_id(request.annotation_id)
|
||||
@@ -73,7 +77,7 @@ class AnnotationMixin:
|
||||
|
||||
annotation = await repo.annotations.get(annotation_id)
|
||||
if annotation is None:
|
||||
await abort_not_found(context, "Annotation", request.annotation_id)
|
||||
await abort_not_found(context, _ENTITY_ANNOTATION, request.annotation_id)
|
||||
return annotation_to_proto(annotation)
|
||||
|
||||
async def ListAnnotations(
|
||||
@@ -84,7 +88,7 @@ class AnnotationMixin:
|
||||
"""List annotations for a meeting."""
|
||||
async with self._create_repository_provider() as repo:
|
||||
if not repo.supports_annotations:
|
||||
await abort_database_required(context, "Annotations")
|
||||
await abort_database_required(context, _ENTITY_ANNOTATIONS)
|
||||
|
||||
meeting_id = await parse_meeting_id_or_abort(request.meeting_id, context)
|
||||
# Check if time range filter is specified
|
||||
@@ -109,7 +113,7 @@ class AnnotationMixin:
|
||||
"""Update an existing annotation."""
|
||||
async with self._create_repository_provider() as repo:
|
||||
if not repo.supports_annotations:
|
||||
await abort_database_required(context, "Annotations")
|
||||
await abort_database_required(context, _ENTITY_ANNOTATIONS)
|
||||
|
||||
try:
|
||||
annotation_id = parse_annotation_id(request.annotation_id)
|
||||
@@ -118,7 +122,7 @@ class AnnotationMixin:
|
||||
|
||||
annotation = await repo.annotations.get(annotation_id)
|
||||
if annotation is None:
|
||||
await abort_not_found(context, "Annotation", request.annotation_id)
|
||||
await abort_not_found(context, _ENTITY_ANNOTATION, request.annotation_id)
|
||||
|
||||
# Update fields if provided
|
||||
if request.annotation_type != noteflow_pb2.ANNOTATION_TYPE_UNSPECIFIED:
|
||||
@@ -144,7 +148,7 @@ class AnnotationMixin:
|
||||
"""Delete an annotation."""
|
||||
async with self._create_repository_provider() as repo:
|
||||
if not repo.supports_annotations:
|
||||
await abort_database_required(context, "Annotations")
|
||||
await abort_database_required(context, _ENTITY_ANNOTATIONS)
|
||||
|
||||
try:
|
||||
annotation_id = parse_annotation_id(request.annotation_id)
|
||||
@@ -155,4 +159,4 @@ class AnnotationMixin:
|
||||
if success:
|
||||
await repo.commit()
|
||||
return noteflow_pb2.DeleteAnnotationResponse(success=True)
|
||||
await abort_not_found(context, "Annotation", request.annotation_id)
|
||||
await abort_not_found(context, _ENTITY_ANNOTATION, request.annotation_id)
|
||||
|
||||
@@ -7,14 +7,35 @@ from typing import TYPE_CHECKING
|
||||
import grpc.aio
|
||||
|
||||
from noteflow.application.services.calendar_service import CalendarServiceError
|
||||
from noteflow.domain.entities.integration import IntegrationStatus
|
||||
from noteflow.domain.value_objects import OAuthProvider
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .errors import abort_internal, abort_invalid_argument, abort_unavailable
|
||||
|
||||
_ERR_CALENDAR_NOT_ENABLED = "Calendar integration not enabled"
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from noteflow.domain.ports.calendar import OAuthConnectionInfo
|
||||
|
||||
from .protocols import ServicerHost
|
||||
|
||||
|
||||
def _build_oauth_connection(
|
||||
info: OAuthConnectionInfo,
|
||||
integration_type: str,
|
||||
) -> noteflow_pb2.OAuthConnection:
|
||||
"""Build OAuthConnection proto from connection info."""
|
||||
return noteflow_pb2.OAuthConnection(
|
||||
provider=info.provider,
|
||||
status=info.status,
|
||||
email=info.email or "",
|
||||
expires_at=int(info.expires_at.timestamp()) if info.expires_at else 0,
|
||||
error_message=info.error_message or "",
|
||||
integration_type=integration_type,
|
||||
)
|
||||
|
||||
|
||||
class CalendarMixin:
|
||||
"""Mixin providing calendar integration functionality.
|
||||
|
||||
@@ -29,7 +50,7 @@ class CalendarMixin:
|
||||
) -> noteflow_pb2.ListCalendarEventsResponse:
|
||||
"""List upcoming calendar events from connected providers."""
|
||||
if self._calendar_service is None:
|
||||
await abort_unavailable(context, "Calendar integration not enabled")
|
||||
await abort_unavailable(context, _ERR_CALENDAR_NOT_ENABLED)
|
||||
|
||||
provider = request.provider if request.provider else None
|
||||
hours_ahead = request.hours_ahead if request.hours_ahead > 0 else None
|
||||
@@ -71,18 +92,18 @@ class CalendarMixin:
|
||||
) -> noteflow_pb2.GetCalendarProvidersResponse:
|
||||
"""Get available calendar providers with authentication status."""
|
||||
if self._calendar_service is None:
|
||||
await abort_unavailable(context, "Calendar integration not enabled")
|
||||
await abort_unavailable(context, _ERR_CALENDAR_NOT_ENABLED)
|
||||
|
||||
providers = []
|
||||
for provider_name, display_name in [
|
||||
("google", "Google Calendar"),
|
||||
("outlook", "Microsoft Outlook"),
|
||||
(OAuthProvider.GOOGLE.value, "Google Calendar"),
|
||||
(OAuthProvider.OUTLOOK.value, "Microsoft Outlook"),
|
||||
]:
|
||||
status = await self._calendar_service.get_connection_status(provider_name)
|
||||
providers.append(
|
||||
noteflow_pb2.CalendarProvider(
|
||||
name=provider_name,
|
||||
is_authenticated=status.status == "connected",
|
||||
is_authenticated=status.status == IntegrationStatus.CONNECTED.value,
|
||||
display_name=display_name,
|
||||
)
|
||||
)
|
||||
@@ -96,7 +117,7 @@ class CalendarMixin:
|
||||
) -> noteflow_pb2.InitiateOAuthResponse:
|
||||
"""Start OAuth flow for a calendar provider."""
|
||||
if self._calendar_service is None:
|
||||
await abort_unavailable(context, "Calendar integration not enabled")
|
||||
await abort_unavailable(context, _ERR_CALENDAR_NOT_ENABLED)
|
||||
|
||||
try:
|
||||
auth_url, state = await self._calendar_service.initiate_oauth(
|
||||
@@ -118,7 +139,7 @@ class CalendarMixin:
|
||||
) -> noteflow_pb2.CompleteOAuthResponse:
|
||||
"""Complete OAuth flow with authorization code."""
|
||||
if self._calendar_service is None:
|
||||
await abort_unavailable(context, "Calendar integration not enabled")
|
||||
await abort_unavailable(context, _ERR_CALENDAR_NOT_ENABLED)
|
||||
|
||||
try:
|
||||
success = await self._calendar_service.complete_oauth(
|
||||
@@ -147,21 +168,14 @@ class CalendarMixin:
|
||||
) -> noteflow_pb2.GetOAuthConnectionStatusResponse:
|
||||
"""Get OAuth connection status for a provider."""
|
||||
if self._calendar_service is None:
|
||||
await abort_unavailable(context, "Calendar integration not enabled")
|
||||
await abort_unavailable(context, _ERR_CALENDAR_NOT_ENABLED)
|
||||
|
||||
status = await self._calendar_service.get_connection_status(request.provider)
|
||||
info = await self._calendar_service.get_connection_status(request.provider)
|
||||
|
||||
connection = noteflow_pb2.OAuthConnection(
|
||||
provider=status.provider,
|
||||
status=status.status,
|
||||
email=status.email or "",
|
||||
expires_at=int(status.expires_at.timestamp()) if status.expires_at else 0,
|
||||
error_message=status.error_message or "",
|
||||
integration_type=request.integration_type or "calendar",
|
||||
return noteflow_pb2.GetOAuthConnectionStatusResponse(
|
||||
connection=_build_oauth_connection(info, request.integration_type or "calendar")
|
||||
)
|
||||
|
||||
return noteflow_pb2.GetOAuthConnectionStatusResponse(connection=connection)
|
||||
|
||||
async def DisconnectOAuth(
|
||||
self: ServicerHost,
|
||||
request: noteflow_pb2.DisconnectOAuthRequest,
|
||||
@@ -169,7 +183,7 @@ class CalendarMixin:
|
||||
) -> noteflow_pb2.DisconnectOAuthResponse:
|
||||
"""Disconnect OAuth integration and revoke tokens."""
|
||||
if self._calendar_service is None:
|
||||
await abort_unavailable(context, "Calendar integration not enabled")
|
||||
await abort_unavailable(context, _ERR_CALENDAR_NOT_ENABLED)
|
||||
|
||||
success = await self._calendar_service.disconnect(request.provider)
|
||||
|
||||
|
||||
@@ -25,7 +25,7 @@ from noteflow.infrastructure.persistence.repositories import (
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .converters import parse_meeting_id, parse_meeting_id_or_abort, parse_meeting_id_or_none
|
||||
from .errors import abort_invalid_argument
|
||||
from .errors import ERR_CANCELLED_BY_USER, abort_invalid_argument
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from collections.abc import Sequence
|
||||
@@ -127,9 +127,6 @@ class DiarizationMixin:
|
||||
Requires host to implement ServicerHost protocol.
|
||||
"""
|
||||
|
||||
# Job retention constant
|
||||
DIARIZATION_JOB_TTL_SECONDS: float = 60 * 60 # 1 hour
|
||||
|
||||
async def _process_streaming_diarization(
|
||||
self: ServicerHost,
|
||||
meeting_id: str,
|
||||
@@ -267,15 +264,9 @@ class DiarizationMixin:
|
||||
if meeting is None:
|
||||
return _create_diarization_error_response("Meeting not found")
|
||||
|
||||
meeting_state = meeting.state
|
||||
if meeting_state in (
|
||||
MeetingState.UNSPECIFIED,
|
||||
MeetingState.CREATED,
|
||||
MeetingState.RECORDING,
|
||||
MeetingState.STOPPING,
|
||||
):
|
||||
if meeting.state not in (MeetingState.STOPPED, MeetingState.COMPLETED, MeetingState.ERROR):
|
||||
return _create_diarization_error_response(
|
||||
f"Meeting must be stopped before refinement (state: {meeting_state.name.lower()})"
|
||||
f"Meeting must be stopped before refinement (state: {meeting.state.name.lower()})"
|
||||
)
|
||||
|
||||
# Check for existing active job (concurrency guard)
|
||||
@@ -292,7 +283,7 @@ class DiarizationMixin:
|
||||
job_id=active_job.job_id,
|
||||
)
|
||||
|
||||
num_speakers = request.num_speakers if request.num_speakers > 0 else None
|
||||
num_speakers = request.num_speakers or None
|
||||
|
||||
job_id = str(uuid4())
|
||||
job = DiarizationJob(
|
||||
@@ -313,13 +304,9 @@ class DiarizationMixin:
|
||||
task = asyncio.create_task(self._run_diarization_job(job_id, num_speakers))
|
||||
self._diarization_tasks[job_id] = task
|
||||
|
||||
response = noteflow_pb2.RefineSpeakerDiarizationResponse()
|
||||
response.segments_updated = 0
|
||||
response.speaker_ids[:] = []
|
||||
response.error_message = ""
|
||||
response.job_id = job_id
|
||||
response.status = noteflow_pb2.JOB_STATUS_QUEUED
|
||||
return response
|
||||
return noteflow_pb2.RefineSpeakerDiarizationResponse(
|
||||
segments_updated=0, job_id=job_id, status=noteflow_pb2.JOB_STATUS_QUEUED
|
||||
)
|
||||
|
||||
async def _run_diarization_job(
|
||||
self: ServicerHost,
|
||||
@@ -386,12 +373,12 @@ class DiarizationMixin:
|
||||
await repo.diarization_jobs.update_status(
|
||||
job_id,
|
||||
noteflow_pb2.JOB_STATUS_CANCELLED,
|
||||
error_message="Cancelled by user",
|
||||
error_message=ERR_CANCELLED_BY_USER,
|
||||
)
|
||||
await repo.commit()
|
||||
elif job is not None:
|
||||
job.status = noteflow_pb2.JOB_STATUS_CANCELLED
|
||||
job.error_message = "Cancelled by user"
|
||||
job.error_message = ERR_CANCELLED_BY_USER
|
||||
job.updated_at = datetime.now()
|
||||
raise # Re-raise to propagate cancellation
|
||||
|
||||
|
||||
@@ -13,7 +13,7 @@ import grpc
|
||||
from noteflow.domain.utils import utc_now
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .errors import abort_not_found
|
||||
from .errors import ERR_CANCELLED_BY_USER, abort_not_found
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .protocols import ServicerHost
|
||||
@@ -21,6 +21,22 @@ if TYPE_CHECKING:
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def _get_diarization_job_ttl_seconds() -> float:
|
||||
"""Get diarization job TTL from settings.
|
||||
|
||||
Returns:
|
||||
TTL in seconds (defaults to 1 hour).
|
||||
"""
|
||||
try:
|
||||
from noteflow.config.settings import get_settings
|
||||
|
||||
settings = get_settings()
|
||||
return float(settings.diarization_job_ttl_hours * 3600)
|
||||
except Exception:
|
||||
# Fallback for testing without full settings
|
||||
return 3600.0 # 1 hour
|
||||
|
||||
|
||||
class _GrpcContext(Protocol):
|
||||
"""Protocol for gRPC servicer context."""
|
||||
|
||||
@@ -44,8 +60,10 @@ class DiarizationJobMixin:
|
||||
Requires host to implement ServicerHost protocol.
|
||||
"""
|
||||
|
||||
# TTL constant - must match service definition
|
||||
DIARIZATION_JOB_TTL_SECONDS: float = 3600.0
|
||||
@property
|
||||
def diarization_job_ttl_seconds(self) -> float:
|
||||
"""Return diarization job TTL from settings."""
|
||||
return _get_diarization_job_ttl_seconds()
|
||||
|
||||
async def _prune_diarization_jobs(self: ServicerHost) -> None:
|
||||
"""Remove completed diarization jobs older than retention window.
|
||||
@@ -68,7 +86,7 @@ class DiarizationJobMixin:
|
||||
async with self._create_repository_provider() as repo:
|
||||
if repo.supports_diarization_jobs:
|
||||
pruned = await repo.diarization_jobs.prune_completed(
|
||||
self.DIARIZATION_JOB_TTL_SECONDS
|
||||
self.diarization_job_ttl_seconds
|
||||
)
|
||||
await repo.commit()
|
||||
if pruned > 0:
|
||||
@@ -76,7 +94,7 @@ class DiarizationJobMixin:
|
||||
else:
|
||||
# In-memory fallback: prune from local dict.
|
||||
# Use naive datetime for comparison since in-memory jobs use naive datetimes.
|
||||
cutoff = datetime.now() - timedelta(seconds=self.DIARIZATION_JOB_TTL_SECONDS)
|
||||
cutoff = datetime.now() - timedelta(seconds=self.diarization_job_ttl_seconds)
|
||||
expired = [
|
||||
job_id
|
||||
for job_id, job in self._diarization_jobs.items()
|
||||
@@ -173,7 +191,7 @@ class DiarizationJobMixin:
|
||||
await repo.diarization_jobs.update_status(
|
||||
job_id,
|
||||
noteflow_pb2.JOB_STATUS_CANCELLED,
|
||||
error_message="Cancelled by user",
|
||||
error_message=ERR_CANCELLED_BY_USER,
|
||||
)
|
||||
await repo.commit()
|
||||
else:
|
||||
@@ -195,7 +213,7 @@ class DiarizationJobMixin:
|
||||
return response
|
||||
|
||||
job.status = noteflow_pb2.JOB_STATUS_CANCELLED
|
||||
job.error_message = "Cancelled by user"
|
||||
job.error_message = ERR_CANCELLED_BY_USER
|
||||
# Use naive datetime for in-memory store consistency
|
||||
job.updated_at = datetime.now()
|
||||
|
||||
|
||||
@@ -9,7 +9,7 @@ import grpc.aio
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .converters import parse_meeting_id_or_abort
|
||||
from .errors import abort_failed_precondition, abort_not_found
|
||||
from .errors import ENTITY_MEETING, abort_failed_precondition, abort_not_found
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from noteflow.application.services.ner_service import NerService
|
||||
@@ -54,7 +54,7 @@ class EntitiesMixin:
|
||||
)
|
||||
except ValueError:
|
||||
# Meeting not found
|
||||
await abort_not_found(context, "Meeting", request.meeting_id)
|
||||
await abort_not_found(context, ENTITY_MEETING, request.meeting_id)
|
||||
except RuntimeError as e:
|
||||
# Feature disabled
|
||||
await abort_failed_precondition(context, str(e))
|
||||
|
||||
@@ -10,6 +10,13 @@ from typing import NoReturn, Protocol
|
||||
|
||||
import grpc
|
||||
|
||||
# Common error messages used across mixins
|
||||
ERR_CANCELLED_BY_USER = "Cancelled by user"
|
||||
_ERR_UNREACHABLE = "Unreachable"
|
||||
|
||||
# Entity type names for abort_not_found calls
|
||||
ENTITY_MEETING = "Meeting"
|
||||
|
||||
|
||||
class _AbortableContext(Protocol):
|
||||
"""Minimal protocol for gRPC context abort operations.
|
||||
@@ -45,7 +52,7 @@ async def abort_not_found(
|
||||
f"{entity_type} {entity_id} not found",
|
||||
)
|
||||
# This line is unreachable but helps type checkers
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
|
||||
async def abort_database_required(
|
||||
@@ -67,7 +74,7 @@ async def abort_database_required(
|
||||
grpc.StatusCode.UNIMPLEMENTED,
|
||||
f"{feature} require database persistence",
|
||||
)
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
|
||||
async def abort_invalid_argument(
|
||||
@@ -84,7 +91,7 @@ async def abort_invalid_argument(
|
||||
grpc.RpcError: Always raises INVALID_ARGUMENT.
|
||||
"""
|
||||
await context.abort(grpc.StatusCode.INVALID_ARGUMENT, message)
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
|
||||
async def abort_failed_precondition(
|
||||
@@ -103,7 +110,7 @@ async def abort_failed_precondition(
|
||||
grpc.RpcError: Always raises FAILED_PRECONDITION.
|
||||
"""
|
||||
await context.abort(grpc.StatusCode.FAILED_PRECONDITION, message)
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
|
||||
async def abort_internal(
|
||||
@@ -122,7 +129,7 @@ async def abort_internal(
|
||||
grpc.RpcError: Always raises INTERNAL.
|
||||
"""
|
||||
await context.abort(grpc.StatusCode.INTERNAL, message)
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
|
||||
async def abort_already_exists(
|
||||
@@ -141,7 +148,7 @@ async def abort_already_exists(
|
||||
grpc.RpcError: Always raises ALREADY_EXISTS.
|
||||
"""
|
||||
await context.abort(grpc.StatusCode.ALREADY_EXISTS, message)
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
|
||||
async def abort_unavailable(
|
||||
@@ -160,4 +167,4 @@ async def abort_unavailable(
|
||||
grpc.RpcError: Always raises UNAVAILABLE.
|
||||
"""
|
||||
await context.abort(grpc.StatusCode.UNAVAILABLE, message)
|
||||
raise AssertionError("Unreachable")
|
||||
raise AssertionError(_ERR_UNREACHABLE)
|
||||
|
||||
@@ -8,10 +8,11 @@ from typing import TYPE_CHECKING
|
||||
import grpc.aio
|
||||
|
||||
from noteflow.application.services.export_service import ExportFormat, ExportService
|
||||
from noteflow.config.constants import EXPORT_EXT_HTML, EXPORT_EXT_PDF, EXPORT_FORMAT_HTML
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .converters import parse_meeting_id_or_abort, proto_to_export_format
|
||||
from .errors import abort_not_found
|
||||
from .errors import ENTITY_MEETING, abort_not_found
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .protocols import ServicerHost
|
||||
@@ -19,8 +20,8 @@ if TYPE_CHECKING:
|
||||
# Format metadata lookup
|
||||
_FORMAT_METADATA: dict[ExportFormat, tuple[str, str]] = {
|
||||
ExportFormat.MARKDOWN: ("Markdown", ".md"),
|
||||
ExportFormat.HTML: ("HTML", ".html"),
|
||||
ExportFormat.PDF: ("PDF", ".pdf"),
|
||||
ExportFormat.HTML: (EXPORT_FORMAT_HTML, EXPORT_EXT_HTML),
|
||||
ExportFormat.PDF: ("PDF", EXPORT_EXT_PDF),
|
||||
}
|
||||
|
||||
|
||||
@@ -66,4 +67,4 @@ class ExportMixin:
|
||||
file_extension=fmt_ext,
|
||||
)
|
||||
except ValueError:
|
||||
await abort_not_found(context, "Meeting", request.meeting_id)
|
||||
await abort_not_found(context, ENTITY_MEETING, request.meeting_id)
|
||||
|
||||
@@ -3,20 +3,24 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import logging
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
import grpc.aio
|
||||
|
||||
from noteflow.config.constants import DEFAULT_MEETING_TITLE
|
||||
from noteflow.domain.entities import Meeting
|
||||
from noteflow.domain.value_objects import MeetingState
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .converters import meeting_to_proto, parse_meeting_id_or_abort
|
||||
from .errors import abort_invalid_argument, abort_not_found
|
||||
from .errors import ENTITY_MEETING, abort_invalid_argument, abort_not_found
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .protocols import ServicerHost
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Timeout for waiting for stream to exit gracefully
|
||||
STOP_WAIT_TIMEOUT_SECONDS: float = 2.0
|
||||
|
||||
@@ -74,7 +78,7 @@ class MeetingMixin:
|
||||
async with self._create_repository_provider() as repo:
|
||||
meeting = await repo.meetings.get(parsed_meeting_id)
|
||||
if meeting is None:
|
||||
await abort_not_found(context, "Meeting", meeting_id)
|
||||
await abort_not_found(context, ENTITY_MEETING, meeting_id)
|
||||
try:
|
||||
# Graceful shutdown: RECORDING -> STOPPING -> STOPPED
|
||||
meeting.begin_stopping()
|
||||
@@ -86,6 +90,23 @@ class MeetingMixin:
|
||||
if repo.supports_diarization_jobs:
|
||||
await repo.diarization_jobs.clear_streaming_turns(meeting_id)
|
||||
await repo.commit()
|
||||
|
||||
# Trigger webhooks (fire-and-forget)
|
||||
if self._webhook_service is not None:
|
||||
try:
|
||||
await self._webhook_service.trigger_recording_stopped(
|
||||
meeting_id=meeting_id,
|
||||
title=meeting.title or DEFAULT_MEETING_TITLE,
|
||||
duration_seconds=meeting.duration_seconds or 0.0,
|
||||
)
|
||||
except Exception:
|
||||
logger.exception("Failed to trigger recording.stopped webhooks")
|
||||
|
||||
try:
|
||||
await self._webhook_service.trigger_meeting_completed(meeting)
|
||||
except Exception:
|
||||
logger.exception("Failed to trigger meeting.completed webhooks")
|
||||
|
||||
return meeting_to_proto(meeting)
|
||||
|
||||
async def ListMeetings(
|
||||
@@ -121,7 +142,7 @@ class MeetingMixin:
|
||||
async with self._create_repository_provider() as repo:
|
||||
meeting = await repo.meetings.get(meeting_id)
|
||||
if meeting is None:
|
||||
await abort_not_found(context, "Meeting", request.meeting_id)
|
||||
await abort_not_found(context, ENTITY_MEETING, request.meeting_id)
|
||||
# Load segments if requested
|
||||
if request.include_segments:
|
||||
segments = await repo.segments.get_by_meeting(meeting.id)
|
||||
@@ -148,4 +169,4 @@ class MeetingMixin:
|
||||
if success:
|
||||
await repo.commit()
|
||||
return noteflow_pb2.DeleteMeetingResponse(success=True)
|
||||
await abort_not_found(context, "Meeting", request.meeting_id)
|
||||
await abort_not_found(context, ENTITY_MEETING, request.meeting_id)
|
||||
|
||||
@@ -14,6 +14,7 @@ if TYPE_CHECKING:
|
||||
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
from noteflow.application.services.ner_service import NerService
|
||||
from noteflow.application.services.webhook_service import WebhookService
|
||||
from noteflow.domain.entities import Meeting
|
||||
from noteflow.domain.ports.unit_of_work import UnitOfWork
|
||||
from noteflow.infrastructure.asr import FasterWhisperEngine, Segmenter, StreamingVad
|
||||
@@ -49,6 +50,7 @@ class ServicerHost(Protocol):
|
||||
_summarization_service: object | None
|
||||
_ner_service: NerService | None
|
||||
_calendar_service: CalendarService | None
|
||||
_webhook_service: WebhookService | None
|
||||
_diarization_refinement_enabled: bool
|
||||
|
||||
# Audio writers
|
||||
@@ -85,7 +87,11 @@ class ServicerHost(Protocol):
|
||||
SUPPORTED_SAMPLE_RATES: list[int]
|
||||
PARTIAL_CADENCE_SECONDS: float
|
||||
MIN_PARTIAL_AUDIO_SECONDS: float
|
||||
DIARIZATION_JOB_TTL_SECONDS: float
|
||||
|
||||
@property
|
||||
def diarization_job_ttl_seconds(self) -> float:
|
||||
"""Return diarization job TTL from settings."""
|
||||
...
|
||||
|
||||
def _use_database(self) -> bool:
|
||||
"""Check if database persistence is configured."""
|
||||
|
||||
@@ -13,6 +13,7 @@ import grpc.aio
|
||||
import numpy as np
|
||||
from numpy.typing import NDArray
|
||||
|
||||
from noteflow.config.constants import DEFAULT_MEETING_TITLE
|
||||
from noteflow.infrastructure.diarization import SpeakerTurn
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
@@ -185,6 +186,16 @@ class StreamingMixin:
|
||||
await repo.meetings.update(meeting)
|
||||
await repo.commit()
|
||||
|
||||
# Trigger recording.started webhook when meeting transitions to RECORDING
|
||||
if recording_updated and self._webhook_service is not None:
|
||||
try:
|
||||
await self._webhook_service.trigger_recording_started(
|
||||
meeting_id=meeting_id,
|
||||
title=meeting.title or DEFAULT_MEETING_TITLE,
|
||||
)
|
||||
except Exception:
|
||||
logger.exception("Failed to trigger recording.started webhooks")
|
||||
|
||||
next_segment_id = await repo.segments.compute_next_segment_id(meeting.id)
|
||||
self._open_meeting_audio_writer(
|
||||
meeting_id, dek, wrapped_dek, asset_path=meeting.asset_path
|
||||
|
||||
@@ -14,7 +14,7 @@ from noteflow.infrastructure.summarization._parsing import build_style_prompt
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .converters import parse_meeting_id_or_abort, summary_to_proto
|
||||
from .errors import abort_not_found
|
||||
from .errors import ENTITY_MEETING, abort_not_found
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from noteflow.application.services.summarization_service import SummarizationService
|
||||
@@ -58,7 +58,7 @@ class SummarizationMixin:
|
||||
async with self._create_repository_provider() as repo:
|
||||
meeting = await repo.meetings.get(meeting_id)
|
||||
if meeting is None:
|
||||
await abort_not_found(context, "Meeting", request.meeting_id)
|
||||
await abort_not_found(context, ENTITY_MEETING, request.meeting_id)
|
||||
|
||||
existing = await repo.summaries.get_by_meeting(meeting.id)
|
||||
if existing and not request.force_regenerate:
|
||||
@@ -74,6 +74,15 @@ class SummarizationMixin:
|
||||
saved = await repo.summaries.save(summary)
|
||||
await repo.commit()
|
||||
|
||||
# Trigger summary.generated webhook (fire-and-forget)
|
||||
if self._webhook_service is not None:
|
||||
try:
|
||||
# Attach saved summary to meeting for webhook payload
|
||||
meeting.summary = saved
|
||||
await self._webhook_service.trigger_summary_generated(meeting)
|
||||
except Exception:
|
||||
logger.exception("Failed to trigger summary.generated webhooks")
|
||||
|
||||
return summary_to_proto(saved)
|
||||
|
||||
async def _summarize_or_placeholder(
|
||||
|
||||
227
src/noteflow/grpc/_mixins/webhooks.py
Normal file
227
src/noteflow/grpc/_mixins/webhooks.py
Normal file
@@ -0,0 +1,227 @@
|
||||
"""Webhook management mixin for gRPC service."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from dataclasses import replace
|
||||
from typing import TYPE_CHECKING
|
||||
from uuid import UUID
|
||||
|
||||
import grpc.aio
|
||||
|
||||
from noteflow.domain.utils.time import utc_now
|
||||
from noteflow.domain.webhooks.events import (
|
||||
WebhookConfig,
|
||||
WebhookDelivery,
|
||||
WebhookEventType,
|
||||
)
|
||||
|
||||
from ..proto import noteflow_pb2
|
||||
from .errors import abort_database_required, abort_invalid_argument, abort_not_found
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from .protocols import ServicerHost
|
||||
|
||||
# Entity type names for error messages
|
||||
_ENTITY_WEBHOOK = "Webhook"
|
||||
_ENTITY_WEBHOOKS = "Webhooks"
|
||||
_ERR_INVALID_WEBHOOK_ID = "Invalid webhook_id format"
|
||||
|
||||
|
||||
def _webhook_config_to_proto(config: WebhookConfig) -> noteflow_pb2.WebhookConfigProto:
|
||||
"""Convert domain WebhookConfig to proto message."""
|
||||
return noteflow_pb2.WebhookConfigProto(
|
||||
id=str(config.id),
|
||||
workspace_id=str(config.workspace_id),
|
||||
name=config.name,
|
||||
url=config.url,
|
||||
events=[e.value for e in config.events],
|
||||
enabled=config.enabled,
|
||||
timeout_ms=config.timeout_ms,
|
||||
max_retries=config.max_retries,
|
||||
created_at=int(config.created_at.timestamp()),
|
||||
updated_at=int(config.updated_at.timestamp()),
|
||||
)
|
||||
|
||||
|
||||
def _webhook_delivery_to_proto(
|
||||
delivery: WebhookDelivery,
|
||||
) -> noteflow_pb2.WebhookDeliveryProto:
|
||||
"""Convert domain WebhookDelivery to proto message."""
|
||||
return noteflow_pb2.WebhookDeliveryProto(
|
||||
id=str(delivery.id),
|
||||
webhook_id=str(delivery.webhook_id),
|
||||
event_type=delivery.event_type.value,
|
||||
status_code=delivery.status_code or 0,
|
||||
error_message=delivery.error_message or "",
|
||||
attempt_count=delivery.attempt_count,
|
||||
duration_ms=delivery.duration_ms or 0,
|
||||
delivered_at=int(delivery.delivered_at.timestamp()),
|
||||
succeeded=delivery.succeeded,
|
||||
)
|
||||
|
||||
|
||||
def _parse_webhook_id(webhook_id_str: str) -> UUID:
|
||||
"""Parse webhook ID string to UUID, raising ValueError if invalid."""
|
||||
return UUID(webhook_id_str)
|
||||
|
||||
|
||||
def _parse_events(event_strings: list[str]) -> frozenset[WebhookEventType]:
|
||||
"""Parse event type strings to WebhookEventType enum values."""
|
||||
return frozenset(WebhookEventType(e) for e in event_strings)
|
||||
|
||||
|
||||
class WebhooksMixin:
|
||||
"""Mixin providing webhook CRUD operations.
|
||||
|
||||
Requires host to implement ServicerHost protocol.
|
||||
Webhooks require database persistence.
|
||||
"""
|
||||
|
||||
async def RegisterWebhook(
|
||||
self: ServicerHost,
|
||||
request: noteflow_pb2.RegisterWebhookRequest,
|
||||
context: grpc.aio.ServicerContext,
|
||||
) -> noteflow_pb2.WebhookConfigProto:
|
||||
"""Register a new webhook configuration."""
|
||||
# Validate URL
|
||||
if not request.url or not request.url.startswith(("http://", "https://")):
|
||||
await abort_invalid_argument(
|
||||
context, "URL must start with http:// or https://"
|
||||
)
|
||||
|
||||
# Validate events
|
||||
if not request.events:
|
||||
await abort_invalid_argument(context, "At least one event type required")
|
||||
|
||||
try:
|
||||
events = _parse_events(list(request.events))
|
||||
except ValueError as exc:
|
||||
await abort_invalid_argument(context, f"Invalid event type: {exc}")
|
||||
|
||||
try:
|
||||
workspace_id = UUID(request.workspace_id)
|
||||
except ValueError:
|
||||
await abort_invalid_argument(context, "Invalid workspace_id format")
|
||||
|
||||
async with self._create_repository_provider() as uow:
|
||||
if not uow.supports_webhooks:
|
||||
await abort_database_required(context, _ENTITY_WEBHOOKS)
|
||||
|
||||
config = WebhookConfig.create(
|
||||
workspace_id=workspace_id,
|
||||
url=request.url,
|
||||
events=list(events),
|
||||
name=request.name or "Webhook",
|
||||
secret=request.secret if request.secret else None,
|
||||
timeout_ms=request.timeout_ms or 10000,
|
||||
max_retries=request.max_retries or 3,
|
||||
)
|
||||
saved = await uow.webhooks.create(config)
|
||||
await uow.commit()
|
||||
return _webhook_config_to_proto(saved)
|
||||
|
||||
async def ListWebhooks(
|
||||
self: ServicerHost,
|
||||
request: noteflow_pb2.ListWebhooksRequest,
|
||||
context: grpc.aio.ServicerContext,
|
||||
) -> noteflow_pb2.ListWebhooksResponse:
|
||||
"""List registered webhooks."""
|
||||
async with self._create_repository_provider() as uow:
|
||||
if not uow.supports_webhooks:
|
||||
await abort_database_required(context, _ENTITY_WEBHOOKS)
|
||||
|
||||
if request.enabled_only:
|
||||
webhooks = await uow.webhooks.get_all_enabled()
|
||||
else:
|
||||
webhooks = await uow.webhooks.get_all()
|
||||
|
||||
return noteflow_pb2.ListWebhooksResponse(
|
||||
webhooks=[_webhook_config_to_proto(w) for w in webhooks],
|
||||
total_count=len(webhooks),
|
||||
)
|
||||
|
||||
async def UpdateWebhook(
|
||||
self: ServicerHost,
|
||||
request: noteflow_pb2.UpdateWebhookRequest,
|
||||
context: grpc.aio.ServicerContext,
|
||||
) -> noteflow_pb2.WebhookConfigProto:
|
||||
"""Update an existing webhook configuration."""
|
||||
try:
|
||||
webhook_id = _parse_webhook_id(request.webhook_id)
|
||||
except ValueError:
|
||||
await abort_invalid_argument(context, _ERR_INVALID_WEBHOOK_ID)
|
||||
|
||||
async with self._create_repository_provider() as uow:
|
||||
if not uow.supports_webhooks:
|
||||
await abort_database_required(context, _ENTITY_WEBHOOKS)
|
||||
|
||||
config = await uow.webhooks.get_by_id(webhook_id)
|
||||
if config is None:
|
||||
await abort_not_found(context, _ENTITY_WEBHOOK, request.webhook_id)
|
||||
|
||||
# Build updates dict with proper typing
|
||||
updates: dict[
|
||||
str, str | frozenset[WebhookEventType] | bool | int | None
|
||||
] = {}
|
||||
|
||||
if request.HasField("url"):
|
||||
updates["url"] = request.url
|
||||
if request.events: # repeated fields don't use HasField
|
||||
updates["events"] = _parse_events(list(request.events))
|
||||
if request.HasField("name"):
|
||||
updates["name"] = request.name
|
||||
if request.HasField("enabled"):
|
||||
updates["enabled"] = request.enabled
|
||||
if request.HasField("timeout_ms"):
|
||||
updates["timeout_ms"] = request.timeout_ms
|
||||
if request.HasField("max_retries"):
|
||||
updates["max_retries"] = request.max_retries
|
||||
if request.HasField("secret"):
|
||||
updates["secret"] = request.secret
|
||||
|
||||
updated = replace(config, **updates, updated_at=utc_now())
|
||||
saved = await uow.webhooks.update(updated)
|
||||
await uow.commit()
|
||||
return _webhook_config_to_proto(saved)
|
||||
|
||||
async def DeleteWebhook(
|
||||
self: ServicerHost,
|
||||
request: noteflow_pb2.DeleteWebhookRequest,
|
||||
context: grpc.aio.ServicerContext,
|
||||
) -> noteflow_pb2.DeleteWebhookResponse:
|
||||
"""Delete a webhook configuration."""
|
||||
try:
|
||||
webhook_id = _parse_webhook_id(request.webhook_id)
|
||||
except ValueError:
|
||||
await abort_invalid_argument(context, _ERR_INVALID_WEBHOOK_ID)
|
||||
|
||||
async with self._create_repository_provider() as uow:
|
||||
if not uow.supports_webhooks:
|
||||
await abort_database_required(context, _ENTITY_WEBHOOKS)
|
||||
|
||||
deleted = await uow.webhooks.delete(webhook_id)
|
||||
await uow.commit()
|
||||
return noteflow_pb2.DeleteWebhookResponse(success=deleted)
|
||||
|
||||
async def GetWebhookDeliveries(
|
||||
self: ServicerHost,
|
||||
request: noteflow_pb2.GetWebhookDeliveriesRequest,
|
||||
context: grpc.aio.ServicerContext,
|
||||
) -> noteflow_pb2.GetWebhookDeliveriesResponse:
|
||||
"""Get delivery history for a webhook."""
|
||||
try:
|
||||
webhook_id = _parse_webhook_id(request.webhook_id)
|
||||
except ValueError:
|
||||
await abort_invalid_argument(context, _ERR_INVALID_WEBHOOK_ID)
|
||||
|
||||
limit = min(request.limit or 50, 500)
|
||||
|
||||
async with self._create_repository_provider() as uow:
|
||||
if not uow.supports_webhooks:
|
||||
await abort_database_required(context, _ENTITY_WEBHOOKS)
|
||||
|
||||
deliveries = await uow.webhooks.get_deliveries(webhook_id, limit=limit)
|
||||
return noteflow_pb2.GetWebhookDeliveriesResponse(
|
||||
deliveries=[_webhook_delivery_to_proto(d) for d in deliveries],
|
||||
total_count=len(deliveries),
|
||||
)
|
||||
@@ -54,6 +54,13 @@ service NoteFlowService {
|
||||
rpc CompleteOAuth(CompleteOAuthRequest) returns (CompleteOAuthResponse);
|
||||
rpc GetOAuthConnectionStatus(GetOAuthConnectionStatusRequest) returns (GetOAuthConnectionStatusResponse);
|
||||
rpc DisconnectOAuth(DisconnectOAuthRequest) returns (DisconnectOAuthResponse);
|
||||
|
||||
// Webhook management (Sprint 6)
|
||||
rpc RegisterWebhook(RegisterWebhookRequest) returns (WebhookConfigProto);
|
||||
rpc ListWebhooks(ListWebhooksRequest) returns (ListWebhooksResponse);
|
||||
rpc UpdateWebhook(UpdateWebhookRequest) returns (WebhookConfigProto);
|
||||
rpc DeleteWebhook(DeleteWebhookRequest) returns (DeleteWebhookResponse);
|
||||
rpc GetWebhookDeliveries(GetWebhookDeliveriesRequest) returns (GetWebhookDeliveriesResponse);
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
@@ -799,3 +806,156 @@ message DisconnectOAuthResponse {
|
||||
// Error message if failed
|
||||
string error_message = 2;
|
||||
}
|
||||
|
||||
// =============================================================================
|
||||
// Webhook Management Messages (Sprint 6)
|
||||
// =============================================================================
|
||||
|
||||
message RegisterWebhookRequest {
|
||||
// Workspace this webhook belongs to
|
||||
string workspace_id = 1;
|
||||
|
||||
// Target URL for webhook delivery
|
||||
string url = 2;
|
||||
|
||||
// Events to subscribe to: meeting.completed, summary.generated, recording.started, recording.stopped
|
||||
repeated string events = 3;
|
||||
|
||||
// Human-readable webhook name
|
||||
string name = 4;
|
||||
|
||||
// Optional HMAC signing secret
|
||||
string secret = 5;
|
||||
|
||||
// Request timeout in milliseconds (default: 10000)
|
||||
int32 timeout_ms = 6;
|
||||
|
||||
// Maximum retry attempts (default: 3)
|
||||
int32 max_retries = 7;
|
||||
}
|
||||
|
||||
message WebhookConfigProto {
|
||||
// Unique webhook identifier
|
||||
string id = 1;
|
||||
|
||||
// Workspace this webhook belongs to
|
||||
string workspace_id = 2;
|
||||
|
||||
// Human-readable webhook name
|
||||
string name = 3;
|
||||
|
||||
// Target URL for webhook delivery
|
||||
string url = 4;
|
||||
|
||||
// Subscribed event types
|
||||
repeated string events = 5;
|
||||
|
||||
// Whether webhook is enabled
|
||||
bool enabled = 6;
|
||||
|
||||
// Request timeout in milliseconds
|
||||
int32 timeout_ms = 7;
|
||||
|
||||
// Maximum retry attempts
|
||||
int32 max_retries = 8;
|
||||
|
||||
// Creation timestamp (Unix epoch seconds)
|
||||
int64 created_at = 9;
|
||||
|
||||
// Last update timestamp (Unix epoch seconds)
|
||||
int64 updated_at = 10;
|
||||
}
|
||||
|
||||
message ListWebhooksRequest {
|
||||
// Filter to only enabled webhooks
|
||||
bool enabled_only = 1;
|
||||
}
|
||||
|
||||
message ListWebhooksResponse {
|
||||
// Registered webhooks
|
||||
repeated WebhookConfigProto webhooks = 1;
|
||||
|
||||
// Total webhook count
|
||||
int32 total_count = 2;
|
||||
}
|
||||
|
||||
message UpdateWebhookRequest {
|
||||
// Webhook ID to update
|
||||
string webhook_id = 1;
|
||||
|
||||
// Updated URL (optional)
|
||||
optional string url = 2;
|
||||
|
||||
// Updated events (replaces existing)
|
||||
repeated string events = 3;
|
||||
|
||||
// Updated name (optional)
|
||||
optional string name = 4;
|
||||
|
||||
// Updated secret (optional)
|
||||
optional string secret = 5;
|
||||
|
||||
// Updated enabled status (optional)
|
||||
optional bool enabled = 6;
|
||||
|
||||
// Updated timeout in milliseconds (optional)
|
||||
optional int32 timeout_ms = 7;
|
||||
|
||||
// Updated max retries (optional)
|
||||
optional int32 max_retries = 8;
|
||||
}
|
||||
|
||||
message DeleteWebhookRequest {
|
||||
// Webhook ID to delete
|
||||
string webhook_id = 1;
|
||||
}
|
||||
|
||||
message DeleteWebhookResponse {
|
||||
// Whether deletion succeeded
|
||||
bool success = 1;
|
||||
}
|
||||
|
||||
message WebhookDeliveryProto {
|
||||
// Unique delivery identifier
|
||||
string id = 1;
|
||||
|
||||
// Webhook ID this delivery belongs to
|
||||
string webhook_id = 2;
|
||||
|
||||
// Event type that triggered this delivery
|
||||
string event_type = 3;
|
||||
|
||||
// HTTP status code (0 if no response)
|
||||
int32 status_code = 4;
|
||||
|
||||
// Error message if delivery failed
|
||||
string error_message = 5;
|
||||
|
||||
// Number of delivery attempts
|
||||
int32 attempt_count = 6;
|
||||
|
||||
// Request duration in milliseconds
|
||||
int32 duration_ms = 7;
|
||||
|
||||
// Delivery timestamp (Unix epoch seconds)
|
||||
int64 delivered_at = 8;
|
||||
|
||||
// Whether delivery succeeded
|
||||
bool succeeded = 9;
|
||||
}
|
||||
|
||||
message GetWebhookDeliveriesRequest {
|
||||
// Webhook ID to get deliveries for
|
||||
string webhook_id = 1;
|
||||
|
||||
// Maximum deliveries to return (default: 50, max: 500)
|
||||
int32 limit = 2;
|
||||
}
|
||||
|
||||
message GetWebhookDeliveriesResponse {
|
||||
// Recent webhook deliveries
|
||||
repeated WebhookDeliveryProto deliveries = 1;
|
||||
|
||||
// Total delivery count
|
||||
int32 total_count = 2;
|
||||
}
|
||||
|
||||
File diff suppressed because one or more lines are too long
@@ -703,3 +703,129 @@ class DisconnectOAuthResponse(_message.Message):
|
||||
success: bool
|
||||
error_message: str
|
||||
def __init__(self, success: bool = ..., error_message: _Optional[str] = ...) -> None: ...
|
||||
|
||||
class RegisterWebhookRequest(_message.Message):
|
||||
__slots__ = ("workspace_id", "url", "events", "name", "secret", "timeout_ms", "max_retries")
|
||||
WORKSPACE_ID_FIELD_NUMBER: _ClassVar[int]
|
||||
URL_FIELD_NUMBER: _ClassVar[int]
|
||||
EVENTS_FIELD_NUMBER: _ClassVar[int]
|
||||
NAME_FIELD_NUMBER: _ClassVar[int]
|
||||
SECRET_FIELD_NUMBER: _ClassVar[int]
|
||||
TIMEOUT_MS_FIELD_NUMBER: _ClassVar[int]
|
||||
MAX_RETRIES_FIELD_NUMBER: _ClassVar[int]
|
||||
workspace_id: str
|
||||
url: str
|
||||
events: _containers.RepeatedScalarFieldContainer[str]
|
||||
name: str
|
||||
secret: str
|
||||
timeout_ms: int
|
||||
max_retries: int
|
||||
def __init__(self, workspace_id: _Optional[str] = ..., url: _Optional[str] = ..., events: _Optional[_Iterable[str]] = ..., name: _Optional[str] = ..., secret: _Optional[str] = ..., timeout_ms: _Optional[int] = ..., max_retries: _Optional[int] = ...) -> None: ...
|
||||
|
||||
class WebhookConfigProto(_message.Message):
|
||||
__slots__ = ("id", "workspace_id", "name", "url", "events", "enabled", "timeout_ms", "max_retries", "created_at", "updated_at")
|
||||
ID_FIELD_NUMBER: _ClassVar[int]
|
||||
WORKSPACE_ID_FIELD_NUMBER: _ClassVar[int]
|
||||
NAME_FIELD_NUMBER: _ClassVar[int]
|
||||
URL_FIELD_NUMBER: _ClassVar[int]
|
||||
EVENTS_FIELD_NUMBER: _ClassVar[int]
|
||||
ENABLED_FIELD_NUMBER: _ClassVar[int]
|
||||
TIMEOUT_MS_FIELD_NUMBER: _ClassVar[int]
|
||||
MAX_RETRIES_FIELD_NUMBER: _ClassVar[int]
|
||||
CREATED_AT_FIELD_NUMBER: _ClassVar[int]
|
||||
UPDATED_AT_FIELD_NUMBER: _ClassVar[int]
|
||||
id: str
|
||||
workspace_id: str
|
||||
name: str
|
||||
url: str
|
||||
events: _containers.RepeatedScalarFieldContainer[str]
|
||||
enabled: bool
|
||||
timeout_ms: int
|
||||
max_retries: int
|
||||
created_at: int
|
||||
updated_at: int
|
||||
def __init__(self, id: _Optional[str] = ..., workspace_id: _Optional[str] = ..., name: _Optional[str] = ..., url: _Optional[str] = ..., events: _Optional[_Iterable[str]] = ..., enabled: bool = ..., timeout_ms: _Optional[int] = ..., max_retries: _Optional[int] = ..., created_at: _Optional[int] = ..., updated_at: _Optional[int] = ...) -> None: ...
|
||||
|
||||
class ListWebhooksRequest(_message.Message):
|
||||
__slots__ = ("enabled_only",)
|
||||
ENABLED_ONLY_FIELD_NUMBER: _ClassVar[int]
|
||||
enabled_only: bool
|
||||
def __init__(self, enabled_only: bool = ...) -> None: ...
|
||||
|
||||
class ListWebhooksResponse(_message.Message):
|
||||
__slots__ = ("webhooks", "total_count")
|
||||
WEBHOOKS_FIELD_NUMBER: _ClassVar[int]
|
||||
TOTAL_COUNT_FIELD_NUMBER: _ClassVar[int]
|
||||
webhooks: _containers.RepeatedCompositeFieldContainer[WebhookConfigProto]
|
||||
total_count: int
|
||||
def __init__(self, webhooks: _Optional[_Iterable[_Union[WebhookConfigProto, _Mapping]]] = ..., total_count: _Optional[int] = ...) -> None: ...
|
||||
|
||||
class UpdateWebhookRequest(_message.Message):
|
||||
__slots__ = ("webhook_id", "url", "events", "name", "secret", "enabled", "timeout_ms", "max_retries")
|
||||
WEBHOOK_ID_FIELD_NUMBER: _ClassVar[int]
|
||||
URL_FIELD_NUMBER: _ClassVar[int]
|
||||
EVENTS_FIELD_NUMBER: _ClassVar[int]
|
||||
NAME_FIELD_NUMBER: _ClassVar[int]
|
||||
SECRET_FIELD_NUMBER: _ClassVar[int]
|
||||
ENABLED_FIELD_NUMBER: _ClassVar[int]
|
||||
TIMEOUT_MS_FIELD_NUMBER: _ClassVar[int]
|
||||
MAX_RETRIES_FIELD_NUMBER: _ClassVar[int]
|
||||
webhook_id: str
|
||||
url: str
|
||||
events: _containers.RepeatedScalarFieldContainer[str]
|
||||
name: str
|
||||
secret: str
|
||||
enabled: bool
|
||||
timeout_ms: int
|
||||
max_retries: int
|
||||
def __init__(self, webhook_id: _Optional[str] = ..., url: _Optional[str] = ..., events: _Optional[_Iterable[str]] = ..., name: _Optional[str] = ..., secret: _Optional[str] = ..., enabled: bool = ..., timeout_ms: _Optional[int] = ..., max_retries: _Optional[int] = ...) -> None: ...
|
||||
|
||||
class DeleteWebhookRequest(_message.Message):
|
||||
__slots__ = ("webhook_id",)
|
||||
WEBHOOK_ID_FIELD_NUMBER: _ClassVar[int]
|
||||
webhook_id: str
|
||||
def __init__(self, webhook_id: _Optional[str] = ...) -> None: ...
|
||||
|
||||
class DeleteWebhookResponse(_message.Message):
|
||||
__slots__ = ("success",)
|
||||
SUCCESS_FIELD_NUMBER: _ClassVar[int]
|
||||
success: bool
|
||||
def __init__(self, success: bool = ...) -> None: ...
|
||||
|
||||
class WebhookDeliveryProto(_message.Message):
|
||||
__slots__ = ("id", "webhook_id", "event_type", "status_code", "error_message", "attempt_count", "duration_ms", "delivered_at", "succeeded")
|
||||
ID_FIELD_NUMBER: _ClassVar[int]
|
||||
WEBHOOK_ID_FIELD_NUMBER: _ClassVar[int]
|
||||
EVENT_TYPE_FIELD_NUMBER: _ClassVar[int]
|
||||
STATUS_CODE_FIELD_NUMBER: _ClassVar[int]
|
||||
ERROR_MESSAGE_FIELD_NUMBER: _ClassVar[int]
|
||||
ATTEMPT_COUNT_FIELD_NUMBER: _ClassVar[int]
|
||||
DURATION_MS_FIELD_NUMBER: _ClassVar[int]
|
||||
DELIVERED_AT_FIELD_NUMBER: _ClassVar[int]
|
||||
SUCCEEDED_FIELD_NUMBER: _ClassVar[int]
|
||||
id: str
|
||||
webhook_id: str
|
||||
event_type: str
|
||||
status_code: int
|
||||
error_message: str
|
||||
attempt_count: int
|
||||
duration_ms: int
|
||||
delivered_at: int
|
||||
succeeded: bool
|
||||
def __init__(self, id: _Optional[str] = ..., webhook_id: _Optional[str] = ..., event_type: _Optional[str] = ..., status_code: _Optional[int] = ..., error_message: _Optional[str] = ..., attempt_count: _Optional[int] = ..., duration_ms: _Optional[int] = ..., delivered_at: _Optional[int] = ..., succeeded: bool = ...) -> None: ...
|
||||
|
||||
class GetWebhookDeliveriesRequest(_message.Message):
|
||||
__slots__ = ("webhook_id", "limit")
|
||||
WEBHOOK_ID_FIELD_NUMBER: _ClassVar[int]
|
||||
LIMIT_FIELD_NUMBER: _ClassVar[int]
|
||||
webhook_id: str
|
||||
limit: int
|
||||
def __init__(self, webhook_id: _Optional[str] = ..., limit: _Optional[int] = ...) -> None: ...
|
||||
|
||||
class GetWebhookDeliveriesResponse(_message.Message):
|
||||
__slots__ = ("deliveries", "total_count")
|
||||
DELIVERIES_FIELD_NUMBER: _ClassVar[int]
|
||||
TOTAL_COUNT_FIELD_NUMBER: _ClassVar[int]
|
||||
deliveries: _containers.RepeatedCompositeFieldContainer[WebhookDeliveryProto]
|
||||
total_count: int
|
||||
def __init__(self, deliveries: _Optional[_Iterable[_Union[WebhookDeliveryProto, _Mapping]]] = ..., total_count: _Optional[int] = ...) -> None: ...
|
||||
|
||||
@@ -163,6 +163,31 @@ class NoteFlowServiceStub(object):
|
||||
request_serializer=noteflow__pb2.DisconnectOAuthRequest.SerializeToString,
|
||||
response_deserializer=noteflow__pb2.DisconnectOAuthResponse.FromString,
|
||||
_registered_method=True)
|
||||
self.RegisterWebhook = channel.unary_unary(
|
||||
'/noteflow.NoteFlowService/RegisterWebhook',
|
||||
request_serializer=noteflow__pb2.RegisterWebhookRequest.SerializeToString,
|
||||
response_deserializer=noteflow__pb2.WebhookConfigProto.FromString,
|
||||
_registered_method=True)
|
||||
self.ListWebhooks = channel.unary_unary(
|
||||
'/noteflow.NoteFlowService/ListWebhooks',
|
||||
request_serializer=noteflow__pb2.ListWebhooksRequest.SerializeToString,
|
||||
response_deserializer=noteflow__pb2.ListWebhooksResponse.FromString,
|
||||
_registered_method=True)
|
||||
self.UpdateWebhook = channel.unary_unary(
|
||||
'/noteflow.NoteFlowService/UpdateWebhook',
|
||||
request_serializer=noteflow__pb2.UpdateWebhookRequest.SerializeToString,
|
||||
response_deserializer=noteflow__pb2.WebhookConfigProto.FromString,
|
||||
_registered_method=True)
|
||||
self.DeleteWebhook = channel.unary_unary(
|
||||
'/noteflow.NoteFlowService/DeleteWebhook',
|
||||
request_serializer=noteflow__pb2.DeleteWebhookRequest.SerializeToString,
|
||||
response_deserializer=noteflow__pb2.DeleteWebhookResponse.FromString,
|
||||
_registered_method=True)
|
||||
self.GetWebhookDeliveries = channel.unary_unary(
|
||||
'/noteflow.NoteFlowService/GetWebhookDeliveries',
|
||||
request_serializer=noteflow__pb2.GetWebhookDeliveriesRequest.SerializeToString,
|
||||
response_deserializer=noteflow__pb2.GetWebhookDeliveriesResponse.FromString,
|
||||
_registered_method=True)
|
||||
|
||||
|
||||
class NoteFlowServiceServicer(object):
|
||||
@@ -332,6 +357,37 @@ class NoteFlowServiceServicer(object):
|
||||
context.set_details('Method not implemented!')
|
||||
raise NotImplementedError('Method not implemented!')
|
||||
|
||||
def RegisterWebhook(self, request, context):
|
||||
"""Webhook management (Sprint 6)
|
||||
"""
|
||||
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
|
||||
context.set_details('Method not implemented!')
|
||||
raise NotImplementedError('Method not implemented!')
|
||||
|
||||
def ListWebhooks(self, request, context):
|
||||
"""Missing associated documentation comment in .proto file."""
|
||||
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
|
||||
context.set_details('Method not implemented!')
|
||||
raise NotImplementedError('Method not implemented!')
|
||||
|
||||
def UpdateWebhook(self, request, context):
|
||||
"""Missing associated documentation comment in .proto file."""
|
||||
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
|
||||
context.set_details('Method not implemented!')
|
||||
raise NotImplementedError('Method not implemented!')
|
||||
|
||||
def DeleteWebhook(self, request, context):
|
||||
"""Missing associated documentation comment in .proto file."""
|
||||
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
|
||||
context.set_details('Method not implemented!')
|
||||
raise NotImplementedError('Method not implemented!')
|
||||
|
||||
def GetWebhookDeliveries(self, request, context):
|
||||
"""Missing associated documentation comment in .proto file."""
|
||||
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
|
||||
context.set_details('Method not implemented!')
|
||||
raise NotImplementedError('Method not implemented!')
|
||||
|
||||
|
||||
def add_NoteFlowServiceServicer_to_server(servicer, server):
|
||||
rpc_method_handlers = {
|
||||
@@ -460,6 +516,31 @@ def add_NoteFlowServiceServicer_to_server(servicer, server):
|
||||
request_deserializer=noteflow__pb2.DisconnectOAuthRequest.FromString,
|
||||
response_serializer=noteflow__pb2.DisconnectOAuthResponse.SerializeToString,
|
||||
),
|
||||
'RegisterWebhook': grpc.unary_unary_rpc_method_handler(
|
||||
servicer.RegisterWebhook,
|
||||
request_deserializer=noteflow__pb2.RegisterWebhookRequest.FromString,
|
||||
response_serializer=noteflow__pb2.WebhookConfigProto.SerializeToString,
|
||||
),
|
||||
'ListWebhooks': grpc.unary_unary_rpc_method_handler(
|
||||
servicer.ListWebhooks,
|
||||
request_deserializer=noteflow__pb2.ListWebhooksRequest.FromString,
|
||||
response_serializer=noteflow__pb2.ListWebhooksResponse.SerializeToString,
|
||||
),
|
||||
'UpdateWebhook': grpc.unary_unary_rpc_method_handler(
|
||||
servicer.UpdateWebhook,
|
||||
request_deserializer=noteflow__pb2.UpdateWebhookRequest.FromString,
|
||||
response_serializer=noteflow__pb2.WebhookConfigProto.SerializeToString,
|
||||
),
|
||||
'DeleteWebhook': grpc.unary_unary_rpc_method_handler(
|
||||
servicer.DeleteWebhook,
|
||||
request_deserializer=noteflow__pb2.DeleteWebhookRequest.FromString,
|
||||
response_serializer=noteflow__pb2.DeleteWebhookResponse.SerializeToString,
|
||||
),
|
||||
'GetWebhookDeliveries': grpc.unary_unary_rpc_method_handler(
|
||||
servicer.GetWebhookDeliveries,
|
||||
request_deserializer=noteflow__pb2.GetWebhookDeliveriesRequest.FromString,
|
||||
response_serializer=noteflow__pb2.GetWebhookDeliveriesResponse.SerializeToString,
|
||||
),
|
||||
}
|
||||
generic_handler = grpc.method_handlers_generic_handler(
|
||||
'noteflow.NoteFlowService', rpc_method_handlers)
|
||||
@@ -1149,3 +1230,138 @@ class NoteFlowService(object):
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@staticmethod
|
||||
def RegisterWebhook(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_unary(
|
||||
request,
|
||||
target,
|
||||
'/noteflow.NoteFlowService/RegisterWebhook',
|
||||
noteflow__pb2.RegisterWebhookRequest.SerializeToString,
|
||||
noteflow__pb2.WebhookConfigProto.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@staticmethod
|
||||
def ListWebhooks(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_unary(
|
||||
request,
|
||||
target,
|
||||
'/noteflow.NoteFlowService/ListWebhooks',
|
||||
noteflow__pb2.ListWebhooksRequest.SerializeToString,
|
||||
noteflow__pb2.ListWebhooksResponse.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@staticmethod
|
||||
def UpdateWebhook(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_unary(
|
||||
request,
|
||||
target,
|
||||
'/noteflow.NoteFlowService/UpdateWebhook',
|
||||
noteflow__pb2.UpdateWebhookRequest.SerializeToString,
|
||||
noteflow__pb2.WebhookConfigProto.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@staticmethod
|
||||
def DeleteWebhook(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_unary(
|
||||
request,
|
||||
target,
|
||||
'/noteflow.NoteFlowService/DeleteWebhook',
|
||||
noteflow__pb2.DeleteWebhookRequest.SerializeToString,
|
||||
noteflow__pb2.DeleteWebhookResponse.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@staticmethod
|
||||
def GetWebhookDeliveries(request,
|
||||
target,
|
||||
options=(),
|
||||
channel_credentials=None,
|
||||
call_credentials=None,
|
||||
insecure=False,
|
||||
compression=None,
|
||||
wait_for_ready=None,
|
||||
timeout=None,
|
||||
metadata=None):
|
||||
return grpc.experimental.unary_unary(
|
||||
request,
|
||||
target,
|
||||
'/noteflow.NoteFlowService/GetWebhookDeliveries',
|
||||
noteflow__pb2.GetWebhookDeliveriesRequest.SerializeToString,
|
||||
noteflow__pb2.GetWebhookDeliveriesResponse.FromString,
|
||||
options,
|
||||
channel_credentials,
|
||||
insecure,
|
||||
call_credentials,
|
||||
compression,
|
||||
wait_for_ready,
|
||||
timeout,
|
||||
metadata,
|
||||
_registered_method=True)
|
||||
|
||||
@@ -15,6 +15,7 @@ from pydantic import ValidationError
|
||||
from noteflow.application.services import RecoveryService
|
||||
from noteflow.application.services.ner_service import NerService
|
||||
from noteflow.application.services.summarization_service import SummarizationService
|
||||
from noteflow.application.services.webhook_service import WebhookService
|
||||
from noteflow.config.settings import Settings, get_settings
|
||||
from noteflow.infrastructure.asr import FasterWhisperEngine
|
||||
from noteflow.infrastructure.asr.engine import VALID_MODEL_SIZES
|
||||
@@ -26,6 +27,7 @@ from noteflow.infrastructure.persistence.database import (
|
||||
)
|
||||
from noteflow.infrastructure.persistence.unit_of_work import SqlAlchemyUnitOfWork
|
||||
from noteflow.infrastructure.summarization import create_summarization_service
|
||||
from noteflow.infrastructure.webhooks import WebhookExecutor
|
||||
|
||||
from ._config import (
|
||||
DEFAULT_MODEL,
|
||||
@@ -67,6 +69,7 @@ class NoteFlowServer:
|
||||
diarization_engine: DiarizationEngine | None = None,
|
||||
diarization_refinement_enabled: bool = True,
|
||||
ner_service: NerService | None = None,
|
||||
webhook_service: WebhookService | None = None,
|
||||
) -> None:
|
||||
"""Initialize the server.
|
||||
|
||||
@@ -80,6 +83,7 @@ class NoteFlowServer:
|
||||
diarization_engine: Optional diarization engine for speaker identification.
|
||||
diarization_refinement_enabled: Whether to allow diarization refinement RPCs.
|
||||
ner_service: Optional NER service for entity extraction.
|
||||
webhook_service: Optional webhook service for event notifications.
|
||||
"""
|
||||
self._port = port
|
||||
self._asr_model = asr_model
|
||||
@@ -90,6 +94,7 @@ class NoteFlowServer:
|
||||
self._diarization_engine = diarization_engine
|
||||
self._diarization_refinement_enabled = diarization_refinement_enabled
|
||||
self._ner_service = ner_service
|
||||
self._webhook_service = webhook_service
|
||||
self._server: grpc.aio.Server | None = None
|
||||
self._servicer: NoteFlowServicer | None = None
|
||||
|
||||
@@ -120,7 +125,7 @@ class NoteFlowServer:
|
||||
self._summarization_service = create_summarization_service()
|
||||
logger.info("Summarization service initialized (default factory)")
|
||||
|
||||
# Create servicer with session factory, summarization, diarization, and NER
|
||||
# Create servicer with session factory, summarization, diarization, NER, and webhooks
|
||||
self._servicer = NoteFlowServicer(
|
||||
asr_engine=asr_engine,
|
||||
session_factory=self._session_factory,
|
||||
@@ -128,6 +133,7 @@ class NoteFlowServer:
|
||||
diarization_engine=self._diarization_engine,
|
||||
diarization_refinement_enabled=self._diarization_refinement_enabled,
|
||||
ner_service=self._ner_service,
|
||||
webhook_service=self._webhook_service,
|
||||
)
|
||||
|
||||
# Create async gRPC server
|
||||
@@ -275,6 +281,29 @@ async def run_server_with_config(config: GrpcServerConfig) -> None:
|
||||
diarization_engine = DiarizationEngine(**diarization_kwargs)
|
||||
logger.info("Diarization engine initialized (models loaded on demand)")
|
||||
|
||||
# Create webhook service if enabled
|
||||
webhook_service: WebhookService | None = None
|
||||
if settings.feature_flags.webhooks_enabled:
|
||||
if session_factory:
|
||||
logger.info("Initializing webhook service...")
|
||||
webhook_executor = WebhookExecutor()
|
||||
webhook_service = WebhookService(executor=webhook_executor)
|
||||
|
||||
# Load enabled webhook configurations from database
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
webhook_configs = await uow.webhooks.get_all_enabled()
|
||||
for config_item in webhook_configs:
|
||||
webhook_service.register_webhook(config_item)
|
||||
logger.info(
|
||||
"Webhook service initialized with %d active webhooks",
|
||||
len(webhook_configs),
|
||||
)
|
||||
else:
|
||||
logger.warning(
|
||||
"Webhooks feature enabled but no database configured. "
|
||||
"Webhooks require database for configuration persistence."
|
||||
)
|
||||
|
||||
server = NoteFlowServer(
|
||||
port=config.port,
|
||||
asr_model=config.asr.model,
|
||||
@@ -285,6 +314,7 @@ async def run_server_with_config(config: GrpcServerConfig) -> None:
|
||||
diarization_engine=diarization_engine,
|
||||
diarization_refinement_enabled=diarization.refinement_enabled,
|
||||
ner_service=ner_service,
|
||||
webhook_service=webhook_service,
|
||||
)
|
||||
|
||||
# Set up graceful shutdown
|
||||
@@ -313,11 +343,19 @@ async def run_server_with_config(config: GrpcServerConfig) -> None:
|
||||
print(f"Diarization: Enabled ({diarization.device})")
|
||||
else:
|
||||
print("Diarization: Disabled")
|
||||
if webhook_service:
|
||||
webhook_count = len(webhook_service.get_webhooks())
|
||||
print(f"Webhooks: Enabled ({webhook_count} registered)")
|
||||
else:
|
||||
print("Webhooks: Disabled")
|
||||
print("Press Ctrl+C to stop\n")
|
||||
|
||||
# Wait for shutdown signal or server termination
|
||||
await shutdown_event.wait()
|
||||
finally:
|
||||
# Clean up webhook service
|
||||
if webhook_service is not None:
|
||||
await webhook_service.close()
|
||||
await server.stop()
|
||||
|
||||
|
||||
|
||||
@@ -12,7 +12,7 @@ from typing import TYPE_CHECKING, ClassVar, Final
|
||||
import grpc.aio
|
||||
import numpy as np
|
||||
|
||||
from noteflow.config.constants import DEFAULT_SAMPLE_RATE as _DEFAULT_SAMPLE_RATE
|
||||
from noteflow.config.constants import APP_DIR_NAME, DEFAULT_SAMPLE_RATE as _DEFAULT_SAMPLE_RATE
|
||||
from noteflow.domain.entities import Meeting
|
||||
from noteflow.domain.ports.unit_of_work import UnitOfWork
|
||||
from noteflow.domain.value_objects import MeetingState
|
||||
@@ -35,6 +35,7 @@ from ._mixins import (
|
||||
MeetingMixin,
|
||||
StreamingMixin,
|
||||
SummarizationMixin,
|
||||
WebhooksMixin,
|
||||
)
|
||||
from .meeting_store import MeetingStore
|
||||
from .proto import noteflow_pb2, noteflow_pb2_grpc
|
||||
@@ -46,6 +47,7 @@ if TYPE_CHECKING:
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
from noteflow.application.services.ner_service import NerService
|
||||
from noteflow.application.services.summarization_service import SummarizationService
|
||||
from noteflow.application.services.webhook_service import WebhookService
|
||||
from noteflow.infrastructure.asr import FasterWhisperEngine
|
||||
from noteflow.infrastructure.diarization import DiarizationEngine, SpeakerTurn
|
||||
|
||||
@@ -62,6 +64,7 @@ class NoteFlowServicer(
|
||||
ExportMixin,
|
||||
EntitiesMixin,
|
||||
CalendarMixin,
|
||||
WebhooksMixin,
|
||||
noteflow_pb2_grpc.NoteFlowServiceServicer,
|
||||
):
|
||||
"""Async gRPC service implementation for NoteFlow with PostgreSQL persistence."""
|
||||
@@ -83,6 +86,7 @@ class NoteFlowServicer(
|
||||
diarization_refinement_enabled: bool = True,
|
||||
ner_service: NerService | None = None,
|
||||
calendar_service: CalendarService | None = None,
|
||||
webhook_service: WebhookService | None = None,
|
||||
) -> None:
|
||||
"""Initialize the service.
|
||||
|
||||
@@ -97,6 +101,7 @@ class NoteFlowServicer(
|
||||
diarization_refinement_enabled: Whether to allow post-meeting diarization refinement.
|
||||
ner_service: Optional NER service for entity extraction.
|
||||
calendar_service: Optional calendar service for OAuth and event fetching.
|
||||
webhook_service: Optional webhook service for event notifications.
|
||||
"""
|
||||
self._asr_engine = asr_engine
|
||||
self._session_factory = session_factory
|
||||
@@ -105,6 +110,7 @@ class NoteFlowServicer(
|
||||
self._diarization_refinement_enabled = diarization_refinement_enabled
|
||||
self._ner_service = ner_service
|
||||
self._calendar_service = calendar_service
|
||||
self._webhook_service = webhook_service
|
||||
self._start_time = time.time()
|
||||
# Fallback to in-memory store if no database configured
|
||||
self._memory_store: MeetingStore | None = (
|
||||
@@ -112,7 +118,7 @@ class NoteFlowServicer(
|
||||
)
|
||||
|
||||
# Audio writing infrastructure
|
||||
self._meetings_dir = meetings_dir or (Path.home() / ".noteflow" / "meetings")
|
||||
self._meetings_dir = meetings_dir or (Path.home() / APP_DIR_NAME / "meetings")
|
||||
self._keystore = KeyringKeyStore()
|
||||
self._crypto = AesGcmCryptoBox(self._keystore)
|
||||
self._audio_writers: dict[str, MeetingAudioWriter] = {}
|
||||
|
||||
@@ -7,6 +7,11 @@ from dataclasses import dataclass, field
|
||||
from enum import Enum
|
||||
|
||||
|
||||
def _timing_error(label: str, end: float, start: float) -> str:
|
||||
"""Format timing validation error message."""
|
||||
return f"{label} end ({end}) < start ({start})"
|
||||
|
||||
|
||||
@dataclass(frozen=True)
|
||||
class WordTiming:
|
||||
"""Word-level timing information."""
|
||||
@@ -19,7 +24,7 @@ class WordTiming:
|
||||
def __post_init__(self) -> None:
|
||||
"""Validate timing data."""
|
||||
if self.end < self.start:
|
||||
raise ValueError(f"Word end ({self.end}) < start ({self.start})")
|
||||
raise ValueError(_timing_error("Word", self.end, self.start))
|
||||
if not 0.0 <= self.probability <= 1.0:
|
||||
raise ValueError(f"Probability must be 0.0-1.0, got {self.probability}")
|
||||
|
||||
@@ -40,7 +45,7 @@ class AsrResult:
|
||||
def __post_init__(self) -> None:
|
||||
"""Validate result data."""
|
||||
if self.end < self.start:
|
||||
raise ValueError(f"Segment end ({self.end}) < start ({self.start})")
|
||||
raise ValueError(_timing_error("Segment", self.end, self.start))
|
||||
|
||||
@property
|
||||
def duration(self) -> float:
|
||||
@@ -59,7 +64,7 @@ class PartialUpdate:
|
||||
def __post_init__(self) -> None:
|
||||
"""Validate partial data."""
|
||||
if self.end < self.start:
|
||||
raise ValueError(f"Partial end ({self.end}) < start ({self.start})")
|
||||
raise ValueError(_timing_error("Partial", self.end, self.start))
|
||||
|
||||
|
||||
class VadEventType(Enum):
|
||||
|
||||
@@ -83,7 +83,8 @@ class TimestampedRingBuffer:
|
||||
Returns:
|
||||
List of all TimestampedAudio chunks, ordered oldest to newest.
|
||||
"""
|
||||
return list(self._buffer)
|
||||
buffer = self._buffer
|
||||
return list(buffer)
|
||||
|
||||
def clear(self) -> None:
|
||||
"""Clear all audio from the buffer."""
|
||||
@@ -102,8 +103,9 @@ class TimestampedRingBuffer:
|
||||
|
||||
@property
|
||||
def chunk_count(self) -> int:
|
||||
"""Number of audio chunks in the buffer."""
|
||||
return len(self._buffer)
|
||||
"""Number of audio chunks currently in the buffer."""
|
||||
buffer = self._buffer
|
||||
return len(buffer)
|
||||
|
||||
def __len__(self) -> int:
|
||||
"""Return number of chunks in buffer."""
|
||||
|
||||
@@ -268,8 +268,8 @@ class MeetingAudioWriter:
|
||||
self._buffer = io.BytesIO()
|
||||
|
||||
@property
|
||||
def is_open(self) -> bool:
|
||||
"""Check if writer is currently open for writing."""
|
||||
def is_recording(self) -> bool:
|
||||
"""Check if writer is currently open for recording."""
|
||||
return self._asset_writer is not None and self._asset_writer.is_open
|
||||
|
||||
@property
|
||||
|
||||
@@ -10,6 +10,13 @@ from datetime import UTC, datetime, timedelta
|
||||
|
||||
import httpx
|
||||
|
||||
from noteflow.config.constants import (
|
||||
DEFAULT_MEETING_TITLE,
|
||||
ERR_API_PREFIX,
|
||||
ERR_TOKEN_EXPIRED,
|
||||
HTTP_AUTHORIZATION,
|
||||
HTTP_BEARER_PREFIX,
|
||||
)
|
||||
from noteflow.domain.ports.calendar import CalendarEventInfo, CalendarPort
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@@ -61,18 +68,18 @@ class GoogleCalendarAdapter(CalendarPort):
|
||||
"orderBy": "startTime",
|
||||
}
|
||||
|
||||
headers = {"Authorization": f"Bearer {access_token}"}
|
||||
headers = {HTTP_AUTHORIZATION: f"{HTTP_BEARER_PREFIX}{access_token}"}
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, params=params, headers=headers)
|
||||
|
||||
if response.status_code == 401:
|
||||
raise GoogleCalendarError("Access token expired or invalid")
|
||||
raise GoogleCalendarError(ERR_TOKEN_EXPIRED)
|
||||
|
||||
if response.status_code != 200:
|
||||
error_msg = response.text
|
||||
logger.error("Google Calendar API error: %s", error_msg)
|
||||
raise GoogleCalendarError(f"API error: {error_msg}")
|
||||
raise GoogleCalendarError(f"{ERR_API_PREFIX}{error_msg}")
|
||||
|
||||
data = response.json()
|
||||
items = data.get("items", [])
|
||||
@@ -91,18 +98,18 @@ class GoogleCalendarAdapter(CalendarPort):
|
||||
Raises:
|
||||
GoogleCalendarError: If API call fails.
|
||||
"""
|
||||
headers = {"Authorization": f"Bearer {access_token}"}
|
||||
headers = {HTTP_AUTHORIZATION: f"{HTTP_BEARER_PREFIX}{access_token}"}
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(self.USERINFO_API_URL, headers=headers)
|
||||
|
||||
if response.status_code == 401:
|
||||
raise GoogleCalendarError("Access token expired or invalid")
|
||||
raise GoogleCalendarError(ERR_TOKEN_EXPIRED)
|
||||
|
||||
if response.status_code != 200:
|
||||
error_msg = response.text
|
||||
logger.error("Google userinfo API error: %s", error_msg)
|
||||
raise GoogleCalendarError(f"API error: {error_msg}")
|
||||
raise GoogleCalendarError(f"{ERR_API_PREFIX}{error_msg}")
|
||||
|
||||
data = response.json()
|
||||
email = data.get("email")
|
||||
@@ -114,7 +121,7 @@ class GoogleCalendarAdapter(CalendarPort):
|
||||
def _parse_event(self, item: dict[str, object]) -> CalendarEventInfo:
|
||||
"""Parse Google Calendar event into CalendarEventInfo."""
|
||||
event_id = str(item.get("id", ""))
|
||||
title = str(item.get("summary", "Untitled"))
|
||||
title = str(item.get("summary", DEFAULT_MEETING_TITLE))
|
||||
|
||||
# Parse start/end times
|
||||
start_data = item.get("start", {})
|
||||
|
||||
@@ -16,6 +16,7 @@ from urllib.parse import urlencode
|
||||
|
||||
import httpx
|
||||
|
||||
from noteflow.config.constants import ERR_TOKEN_REFRESH_PREFIX
|
||||
from noteflow.domain.ports.calendar import OAuthPort
|
||||
from noteflow.domain.value_objects import OAuthProvider, OAuthState, OAuthTokens
|
||||
|
||||
@@ -151,7 +152,7 @@ class OAuthManager(OAuthPort):
|
||||
if oauth_state is None:
|
||||
raise OAuthError("Invalid or expired state token")
|
||||
|
||||
if oauth_state.is_expired():
|
||||
if oauth_state.is_state_expired():
|
||||
raise OAuthError("State token has expired")
|
||||
|
||||
if oauth_state.provider != provider:
|
||||
@@ -210,7 +211,7 @@ class OAuthManager(OAuthPort):
|
||||
provider.value,
|
||||
error_detail,
|
||||
)
|
||||
raise OAuthError(f"Token refresh failed: {error_detail}")
|
||||
raise OAuthError(f"{ERR_TOKEN_REFRESH_PREFIX}{error_detail}")
|
||||
|
||||
token_data = response.json()
|
||||
tokens = self._parse_token_response(token_data, refresh_token)
|
||||
@@ -418,7 +419,7 @@ class OAuthManager(OAuthPort):
|
||||
expired_keys = [
|
||||
key
|
||||
for key, state in self._pending_states.items()
|
||||
if state.is_expired() or now > state.expires_at
|
||||
if state.is_state_expired() or now > state.expires_at
|
||||
]
|
||||
for key in expired_keys:
|
||||
del self._pending_states[key]
|
||||
|
||||
@@ -10,6 +10,13 @@ from datetime import UTC, datetime, timedelta
|
||||
|
||||
import httpx
|
||||
|
||||
from noteflow.config.constants import (
|
||||
DEFAULT_MEETING_TITLE,
|
||||
ERR_API_PREFIX,
|
||||
ERR_TOKEN_EXPIRED,
|
||||
HTTP_AUTHORIZATION,
|
||||
HTTP_BEARER_PREFIX,
|
||||
)
|
||||
from noteflow.domain.ports.calendar import CalendarEventInfo, CalendarPort
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
@@ -64,7 +71,7 @@ class OutlookCalendarAdapter(CalendarPort):
|
||||
}
|
||||
|
||||
headers = {
|
||||
"Authorization": f"Bearer {access_token}",
|
||||
HTTP_AUTHORIZATION: f"{HTTP_BEARER_PREFIX}{access_token}",
|
||||
"Prefer": 'outlook.timezone="UTC"',
|
||||
}
|
||||
|
||||
@@ -72,12 +79,12 @@ class OutlookCalendarAdapter(CalendarPort):
|
||||
response = await client.get(url, params=params, headers=headers)
|
||||
|
||||
if response.status_code == 401:
|
||||
raise OutlookCalendarError("Access token expired or invalid")
|
||||
raise OutlookCalendarError(ERR_TOKEN_EXPIRED)
|
||||
|
||||
if response.status_code != 200:
|
||||
error_msg = response.text
|
||||
logger.error("Microsoft Graph API error: %s", error_msg)
|
||||
raise OutlookCalendarError(f"API error: {error_msg}")
|
||||
raise OutlookCalendarError(f"{ERR_API_PREFIX}{error_msg}")
|
||||
|
||||
data = response.json()
|
||||
items = data.get("value", [])
|
||||
@@ -98,18 +105,18 @@ class OutlookCalendarAdapter(CalendarPort):
|
||||
"""
|
||||
url = f"{self.GRAPH_API_BASE}/me"
|
||||
params = {"$select": "mail,userPrincipalName"}
|
||||
headers = {"Authorization": f"Bearer {access_token}"}
|
||||
headers = {HTTP_AUTHORIZATION: f"{HTTP_BEARER_PREFIX}{access_token}"}
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, params=params, headers=headers)
|
||||
|
||||
if response.status_code == 401:
|
||||
raise OutlookCalendarError("Access token expired or invalid")
|
||||
raise OutlookCalendarError(ERR_TOKEN_EXPIRED)
|
||||
|
||||
if response.status_code != 200:
|
||||
error_msg = response.text
|
||||
logger.error("Microsoft Graph API error: %s", error_msg)
|
||||
raise OutlookCalendarError(f"API error: {error_msg}")
|
||||
raise OutlookCalendarError(f"{ERR_API_PREFIX}{error_msg}")
|
||||
|
||||
data = response.json()
|
||||
# Prefer mail, fall back to userPrincipalName
|
||||
@@ -122,7 +129,7 @@ class OutlookCalendarAdapter(CalendarPort):
|
||||
def _parse_event(self, item: dict[str, object]) -> CalendarEventInfo:
|
||||
"""Parse Microsoft Graph event into CalendarEventInfo."""
|
||||
event_id = str(item.get("id", ""))
|
||||
title = str(item.get("subject", "Untitled"))
|
||||
title = str(item.get("subject", DEFAULT_MEETING_TITLE))
|
||||
|
||||
# Parse start/end times
|
||||
start_data = item.get("start", {})
|
||||
|
||||
@@ -3,9 +3,11 @@
|
||||
from noteflow.infrastructure.converters.asr_converters import AsrConverter
|
||||
from noteflow.infrastructure.converters.calendar_converters import CalendarEventConverter
|
||||
from noteflow.infrastructure.converters.orm_converters import OrmConverter
|
||||
from noteflow.infrastructure.converters.webhook_converters import WebhookConverter
|
||||
|
||||
__all__ = [
|
||||
"AsrConverter",
|
||||
"CalendarEventConverter",
|
||||
"OrmConverter",
|
||||
"WebhookConverter",
|
||||
]
|
||||
|
||||
119
src/noteflow/infrastructure/converters/webhook_converters.py
Normal file
119
src/noteflow/infrastructure/converters/webhook_converters.py
Normal file
@@ -0,0 +1,119 @@
|
||||
"""Webhook ORM to domain entity converters."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from noteflow.domain.webhooks import (
|
||||
WebhookConfig,
|
||||
WebhookDelivery,
|
||||
WebhookEventType,
|
||||
)
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from noteflow.infrastructure.persistence.models.integrations.webhook import (
|
||||
WebhookConfigModel,
|
||||
WebhookDeliveryModel,
|
||||
)
|
||||
|
||||
|
||||
class WebhookConverter:
|
||||
"""Convert between webhook domain objects and ORM models."""
|
||||
|
||||
@staticmethod
|
||||
def config_to_domain(model: WebhookConfigModel) -> WebhookConfig:
|
||||
"""Convert ORM WebhookConfigModel to domain WebhookConfig.
|
||||
|
||||
Args:
|
||||
model: SQLAlchemy WebhookConfigModel instance.
|
||||
|
||||
Returns:
|
||||
Domain WebhookConfig entity.
|
||||
"""
|
||||
return WebhookConfig(
|
||||
id=model.id,
|
||||
workspace_id=model.workspace_id,
|
||||
name=model.name,
|
||||
url=model.url,
|
||||
events=frozenset(WebhookEventType(e) for e in model.events),
|
||||
secret=model.secret,
|
||||
enabled=model.enabled,
|
||||
timeout_ms=model.timeout_ms,
|
||||
max_retries=model.max_retries,
|
||||
created_at=model.created_at,
|
||||
updated_at=model.updated_at,
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def config_to_orm_kwargs(entity: WebhookConfig) -> dict[str, object]:
|
||||
"""Convert domain WebhookConfig to ORM model kwargs.
|
||||
|
||||
Returns a dict of kwargs rather than instantiating WebhookConfigModel
|
||||
directly to avoid circular imports and allow the repository to
|
||||
handle ORM construction.
|
||||
|
||||
Args:
|
||||
entity: Domain WebhookConfig.
|
||||
|
||||
Returns:
|
||||
Kwargs dict for WebhookConfigModel construction.
|
||||
"""
|
||||
return {
|
||||
"id": entity.id,
|
||||
"workspace_id": entity.workspace_id,
|
||||
"name": entity.name,
|
||||
"url": entity.url,
|
||||
"events": [e.value for e in entity.events],
|
||||
"secret": entity.secret,
|
||||
"enabled": entity.enabled,
|
||||
"timeout_ms": entity.timeout_ms,
|
||||
"max_retries": entity.max_retries,
|
||||
"created_at": entity.created_at,
|
||||
"updated_at": entity.updated_at,
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def delivery_to_domain(model: WebhookDeliveryModel) -> WebhookDelivery:
|
||||
"""Convert ORM WebhookDeliveryModel to domain WebhookDelivery.
|
||||
|
||||
Args:
|
||||
model: SQLAlchemy WebhookDeliveryModel instance.
|
||||
|
||||
Returns:
|
||||
Domain WebhookDelivery entity.
|
||||
"""
|
||||
return WebhookDelivery(
|
||||
id=model.id,
|
||||
webhook_id=model.webhook_id,
|
||||
event_type=WebhookEventType(model.event_type),
|
||||
payload=dict(model.payload),
|
||||
status_code=model.status_code,
|
||||
response_body=model.response_body,
|
||||
error_message=model.error_message,
|
||||
attempt_count=model.attempt_count,
|
||||
duration_ms=model.duration_ms,
|
||||
delivered_at=model.delivered_at,
|
||||
)
|
||||
|
||||
@staticmethod
|
||||
def delivery_to_orm_kwargs(entity: WebhookDelivery) -> dict[str, object]:
|
||||
"""Convert domain WebhookDelivery to ORM model kwargs.
|
||||
|
||||
Args:
|
||||
entity: Domain WebhookDelivery.
|
||||
|
||||
Returns:
|
||||
Kwargs dict for WebhookDeliveryModel construction.
|
||||
"""
|
||||
return {
|
||||
"id": entity.id,
|
||||
"webhook_id": entity.webhook_id,
|
||||
"event_type": entity.event_type.value,
|
||||
"payload": entity.payload,
|
||||
"status_code": entity.status_code,
|
||||
"response_body": entity.response_body,
|
||||
"error_message": entity.error_message,
|
||||
"attempt_count": entity.attempt_count,
|
||||
"duration_ms": entity.duration_ms,
|
||||
"delivered_at": entity.delivered_at,
|
||||
}
|
||||
@@ -13,7 +13,7 @@ import os
|
||||
import warnings
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from noteflow.config.constants import DEFAULT_SAMPLE_RATE
|
||||
from noteflow.config.constants import DEFAULT_SAMPLE_RATE, ERR_HF_TOKEN_REQUIRED
|
||||
from noteflow.infrastructure.diarization.dto import SpeakerTurn
|
||||
from noteflow.infrastructure.diarization.session import DiarizationSession
|
||||
|
||||
@@ -107,7 +107,7 @@ class DiarizationEngine:
|
||||
return
|
||||
|
||||
if not self._hf_token:
|
||||
raise ValueError("HuggingFace token required for pyannote models")
|
||||
raise ValueError(ERR_HF_TOKEN_REQUIRED)
|
||||
|
||||
device = self._resolve_device()
|
||||
|
||||
@@ -159,7 +159,7 @@ class DiarizationEngine:
|
||||
return
|
||||
|
||||
if not self._hf_token:
|
||||
raise ValueError("HuggingFace token required for pyannote models")
|
||||
raise ValueError(ERR_HF_TOKEN_REQUIRED)
|
||||
|
||||
device = self._resolve_device()
|
||||
logger.info("Loading shared streaming diarization models on %s...", device)
|
||||
@@ -230,7 +230,7 @@ class DiarizationEngine:
|
||||
return
|
||||
|
||||
if not self._hf_token:
|
||||
raise ValueError("HuggingFace token required for pyannote models")
|
||||
raise ValueError(ERR_HF_TOKEN_REQUIRED)
|
||||
|
||||
device = self._resolve_device()
|
||||
|
||||
|
||||
@@ -162,7 +162,8 @@ class DiarizationSession:
|
||||
@property
|
||||
def turns(self) -> list[SpeakerTurn]:
|
||||
"""All accumulated speaker turns for this session."""
|
||||
return list(self._turns)
|
||||
turns = self._turns
|
||||
return list(turns)
|
||||
|
||||
@property
|
||||
def is_closed(self) -> bool:
|
||||
|
||||
@@ -1,8 +1,23 @@
|
||||
"""Shared formatting utilities for export modules."""
|
||||
|
||||
import html as html_module
|
||||
from datetime import datetime
|
||||
|
||||
|
||||
def escape_html(text: str) -> str:
|
||||
"""Escape HTML special characters.
|
||||
|
||||
Args:
|
||||
text: Raw text to escape.
|
||||
|
||||
Returns:
|
||||
HTML-safe text with special characters converted to entities.
|
||||
"""
|
||||
if not text:
|
||||
return text
|
||||
return html_module.escape(text)
|
||||
|
||||
|
||||
def format_timestamp(seconds: float) -> str:
|
||||
"""Format seconds as MM:SS or HH:MM:SS.
|
||||
|
||||
|
||||
@@ -5,11 +5,15 @@ Export meeting transcripts to HTML format.
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import html
|
||||
from datetime import datetime
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from noteflow.infrastructure.export._formatting import format_datetime, format_timestamp
|
||||
from noteflow.config.constants import EXPORT_EXT_HTML, EXPORT_FORMAT_HTML
|
||||
from noteflow.infrastructure.export._formatting import (
|
||||
escape_html,
|
||||
format_datetime,
|
||||
format_timestamp,
|
||||
)
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from collections.abc import Sequence
|
||||
@@ -18,18 +22,6 @@ if TYPE_CHECKING:
|
||||
from noteflow.domain.entities.segment import Segment
|
||||
|
||||
|
||||
def _escape(text: str) -> str:
|
||||
"""Escape HTML special characters.
|
||||
|
||||
Args:
|
||||
text: Raw text to escape.
|
||||
|
||||
Returns:
|
||||
HTML-safe text.
|
||||
"""
|
||||
return html.escape(text)
|
||||
|
||||
|
||||
# HTML template with embedded CSS for print-friendly output
|
||||
_HTML_TEMPLATE = """<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
@@ -84,12 +76,12 @@ class HtmlExporter:
|
||||
@property
|
||||
def format_name(self) -> str:
|
||||
"""Human-readable format name."""
|
||||
return "HTML"
|
||||
return EXPORT_FORMAT_HTML
|
||||
|
||||
@property
|
||||
def file_extension(self) -> str:
|
||||
"""File extension for HTML."""
|
||||
return ".html"
|
||||
return EXPORT_EXT_HTML
|
||||
|
||||
def export(
|
||||
self,
|
||||
@@ -106,21 +98,21 @@ class HtmlExporter:
|
||||
HTML-formatted transcript string.
|
||||
"""
|
||||
content_parts: list[str] = [
|
||||
f"<h1>{_escape(meeting.title)}</h1>",
|
||||
f"<h1>{escape_html(meeting.title)}</h1>",
|
||||
'<div class="metadata">',
|
||||
"<dl>",
|
||||
]
|
||||
|
||||
content_parts.append(
|
||||
f"<dt>Date:</dt><dd>{_escape(format_datetime(meeting.created_at))}</dd>"
|
||||
f"<dt>Date:</dt><dd>{escape_html(format_datetime(meeting.created_at))}</dd>"
|
||||
)
|
||||
if meeting.started_at:
|
||||
content_parts.append(
|
||||
f"<dt>Started:</dt><dd>{_escape(format_datetime(meeting.started_at))}</dd>"
|
||||
f"<dt>Started:</dt><dd>{escape_html(format_datetime(meeting.started_at))}</dd>"
|
||||
)
|
||||
if meeting.ended_at:
|
||||
content_parts.append(
|
||||
f"<dt>Ended:</dt><dd>{_escape(format_datetime(meeting.ended_at))}</dd>"
|
||||
f"<dt>Ended:</dt><dd>{escape_html(format_datetime(meeting.ended_at))}</dd>"
|
||||
)
|
||||
content_parts.append(
|
||||
f"<dt>Duration:</dt><dd>{format_timestamp(meeting.duration_seconds)}</dd>"
|
||||
@@ -138,19 +130,19 @@ class HtmlExporter:
|
||||
timestamp = format_timestamp(segment.start_time)
|
||||
content_parts.append('<div class="segment">')
|
||||
content_parts.append(f'<span class="timestamp">[{timestamp}]</span>')
|
||||
content_parts.extend((f"<span>{_escape(segment.text)}</span>", "</div>"))
|
||||
content_parts.extend((f"<span>{escape_html(segment.text)}</span>", "</div>"))
|
||||
content_parts.append("</div>")
|
||||
|
||||
# Summary section (if available)
|
||||
if meeting.summary:
|
||||
content_parts.extend(('<div class="summary">', "<h2>Summary</h2>"))
|
||||
if meeting.summary.executive_summary:
|
||||
content_parts.append(f"<p>{_escape(meeting.summary.executive_summary)}</p>")
|
||||
content_parts.append(f"<p>{escape_html(meeting.summary.executive_summary)}</p>")
|
||||
|
||||
if meeting.summary.key_points:
|
||||
content_parts.extend(("<h3>Key Points</h3>", '<ul class="key-points">'))
|
||||
content_parts.extend(
|
||||
f"<li>{_escape(point.text)}</li>" for point in meeting.summary.key_points
|
||||
f"<li>{escape_html(point.text)}</li>" for point in meeting.summary.key_points
|
||||
)
|
||||
content_parts.append("</ul>")
|
||||
|
||||
@@ -158,11 +150,11 @@ class HtmlExporter:
|
||||
content_parts.extend(("<h3>Action Items</h3>", '<ul class="action-items">'))
|
||||
for item in meeting.summary.action_items:
|
||||
assignee = (
|
||||
f' <span class="assignee">@{_escape(item.assignee)}</span>'
|
||||
f' <span class="assignee">@{escape_html(item.assignee)}</span>'
|
||||
if item.assignee
|
||||
else ""
|
||||
)
|
||||
content_parts.append(f"<li>{_escape(item.text)}{assignee}</li>")
|
||||
content_parts.append(f"<li>{escape_html(item.text)}{assignee}</li>")
|
||||
content_parts.append("</ul>")
|
||||
|
||||
content_parts.append("</div>")
|
||||
@@ -171,9 +163,9 @@ class HtmlExporter:
|
||||
content_parts.append("<footer>")
|
||||
content_parts.extend(
|
||||
(
|
||||
f"Exported from NoteFlow on {_escape(format_datetime(datetime.now()))}",
|
||||
f"Exported from NoteFlow on {escape_html(format_datetime(datetime.now()))}",
|
||||
"</footer>",
|
||||
)
|
||||
)
|
||||
content = "\n".join(content_parts)
|
||||
return _HTML_TEMPLATE.format(title=_escape(meeting.title), content=content)
|
||||
return _HTML_TEMPLATE.format(title=escape_html(meeting.title), content=content)
|
||||
|
||||
@@ -5,10 +5,14 @@ Export meeting transcripts to PDF format.
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import html
|
||||
from typing import TYPE_CHECKING, Protocol, cast
|
||||
|
||||
from noteflow.infrastructure.export._formatting import format_datetime, format_timestamp
|
||||
from noteflow.config.constants import EXPORT_EXT_PDF, EXPORT_FORMAT_HTML
|
||||
from noteflow.infrastructure.export._formatting import (
|
||||
escape_html,
|
||||
format_datetime,
|
||||
format_timestamp,
|
||||
)
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from collections.abc import Sequence
|
||||
@@ -42,18 +46,6 @@ def _get_weasy_html() -> type[_WeasyHTMLProtocol] | None:
|
||||
return cast(type[_WeasyHTMLProtocol], html_class)
|
||||
|
||||
|
||||
def _escape(text: str) -> str:
|
||||
"""Escape HTML special characters.
|
||||
|
||||
Args:
|
||||
text: Raw text to escape.
|
||||
|
||||
Returns:
|
||||
HTML-safe text.
|
||||
"""
|
||||
return html.escape(text)
|
||||
|
||||
|
||||
# PDF-optimized CSS with A4 page settings
|
||||
_PDF_CSS = """
|
||||
@page {
|
||||
@@ -164,7 +156,7 @@ class PdfExporter:
|
||||
@property
|
||||
def file_extension(self) -> str:
|
||||
"""File extension for PDF."""
|
||||
return ".pdf"
|
||||
return EXPORT_EXT_PDF
|
||||
|
||||
def export(
|
||||
self,
|
||||
@@ -203,7 +195,7 @@ class PdfExporter:
|
||||
Returns:
|
||||
HTML string for PDF conversion.
|
||||
"""
|
||||
title = _escape(meeting.title)
|
||||
title = escape_html(meeting.title)
|
||||
date = format_datetime(meeting.created_at)
|
||||
duration = format_timestamp(meeting.duration_seconds)
|
||||
|
||||
@@ -220,7 +212,7 @@ class PdfExporter:
|
||||
<body>
|
||||
<h1>{title}</h1>
|
||||
<div class="metadata">
|
||||
<strong>Date:</strong> {_escape(date)} |
|
||||
<strong>Date:</strong> {escape_html(date)} |
|
||||
<strong>Duration:</strong> {duration} |
|
||||
<strong>Segments:</strong> {len(segments)}
|
||||
</div>
|
||||
@@ -242,9 +234,9 @@ class PdfExporter:
|
||||
parts: list[str] = []
|
||||
|
||||
for segment in segments:
|
||||
speaker = _escape(segment.speaker_id or "Unknown")
|
||||
speaker = escape_html(segment.speaker_id or "Unknown")
|
||||
timestamp = format_timestamp(segment.start_time)
|
||||
text = _escape(segment.text)
|
||||
text = escape_html(segment.text)
|
||||
|
||||
parts.append(f"""
|
||||
<div class="segment">
|
||||
@@ -268,12 +260,12 @@ class PdfExporter:
|
||||
if not summary:
|
||||
return ""
|
||||
|
||||
exec_summary = _escape(summary.executive_summary)
|
||||
exec_summary = escape_html(summary.executive_summary)
|
||||
|
||||
key_points_html = ""
|
||||
if summary.key_points:
|
||||
items = "\n".join(
|
||||
f"<li>{_escape(kp.text)}</li>" for kp in summary.key_points
|
||||
f"<li>{escape_html(kp.text)}</li>" for kp in summary.key_points
|
||||
)
|
||||
key_points_html = f"""
|
||||
<h3>Key Points</h3>
|
||||
@@ -284,7 +276,7 @@ class PdfExporter:
|
||||
action_items_html = ""
|
||||
if summary.action_items:
|
||||
items = "\n".join(
|
||||
f'<div class="action-item">{_escape(ai.text)}</div>'
|
||||
f'<div class="action-item">{escape_html(ai.text)}</div>'
|
||||
for ai in summary.action_items
|
||||
)
|
||||
action_items_html = f"""
|
||||
|
||||
@@ -10,6 +10,12 @@ import logging
|
||||
from functools import partial
|
||||
from typing import TYPE_CHECKING, Final
|
||||
|
||||
from noteflow.config.constants import (
|
||||
SPACY_MODEL_LG,
|
||||
SPACY_MODEL_MD,
|
||||
SPACY_MODEL_SM,
|
||||
SPACY_MODEL_TRF,
|
||||
)
|
||||
from noteflow.domain.entities.named_entity import EntityCategory, NamedEntity
|
||||
|
||||
if TYPE_CHECKING:
|
||||
@@ -56,10 +62,10 @@ _SKIP_ENTITY_TYPES: Final[frozenset[str]] = frozenset({
|
||||
|
||||
# Valid model names
|
||||
VALID_SPACY_MODELS: Final[tuple[str, ...]] = (
|
||||
"en_core_web_sm",
|
||||
"en_core_web_md",
|
||||
"en_core_web_lg",
|
||||
"en_core_web_trf",
|
||||
SPACY_MODEL_SM,
|
||||
SPACY_MODEL_MD,
|
||||
SPACY_MODEL_LG,
|
||||
SPACY_MODEL_TRF,
|
||||
)
|
||||
|
||||
|
||||
@@ -73,12 +79,12 @@ class NerEngine:
|
||||
long transcripts while maintaining segment tracking.
|
||||
"""
|
||||
|
||||
def __init__(self, model_name: str = "en_core_web_trf") -> None:
|
||||
def __init__(self, model_name: str = SPACY_MODEL_TRF) -> None:
|
||||
"""Initialize NER engine.
|
||||
|
||||
Args:
|
||||
model_name: spaCy model to use. Defaults to en_core_web_trf
|
||||
(transformer-based for higher accuracy).
|
||||
model_name: spaCy model to use. Defaults to transformer model
|
||||
for higher accuracy.
|
||||
"""
|
||||
if model_name not in VALID_SPACY_MODELS:
|
||||
raise ValueError(
|
||||
|
||||
@@ -10,9 +10,18 @@ from collections.abc import Sequence
|
||||
from datetime import datetime
|
||||
from typing import TYPE_CHECKING
|
||||
|
||||
from noteflow.config.constants import ERR_SERVER_RESTARTED
|
||||
from noteflow.domain.entities import Meeting, Segment, Summary
|
||||
from noteflow.domain.value_objects import MeetingState
|
||||
|
||||
# Error messages for unsupported operations in memory mode
|
||||
_ERR_ANNOTATIONS_DB = "Annotations require database persistence"
|
||||
_ERR_DIARIZATION_DB = "Diarization jobs require database persistence"
|
||||
_ERR_PREFERENCES_DB = "Preferences require database persistence"
|
||||
_ERR_NER_ENTITIES_DB = "NER entities require database persistence"
|
||||
_ERR_INTEGRATIONS_DB = "Integrations require database persistence"
|
||||
_ERR_WEBHOOKS_DB = "Webhooks require database persistence"
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from uuid import UUID
|
||||
|
||||
@@ -20,6 +29,7 @@ if TYPE_CHECKING:
|
||||
from noteflow.domain.entities.integration import Integration
|
||||
from noteflow.domain.entities.named_entity import NamedEntity
|
||||
from noteflow.domain.value_objects import AnnotationId, MeetingId
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookDelivery
|
||||
from noteflow.grpc.meeting_store import MeetingStore
|
||||
from noteflow.infrastructure.persistence.repositories import (
|
||||
DiarizationJob,
|
||||
@@ -31,11 +41,7 @@ class MemoryMeetingRepository:
|
||||
"""Meeting repository backed by MeetingStore."""
|
||||
|
||||
def __init__(self, store: MeetingStore) -> None:
|
||||
"""Initialize with meeting store.
|
||||
|
||||
Args:
|
||||
store: In-memory meeting store.
|
||||
"""
|
||||
"""Initialize with meeting store."""
|
||||
self._store = store
|
||||
|
||||
async def create(self, meeting: Meeting) -> Meeting:
|
||||
@@ -55,13 +61,10 @@ class MemoryMeetingRepository:
|
||||
return self._store.delete(str(meeting_id))
|
||||
|
||||
async def list_all(
|
||||
self,
|
||||
states: list[MeetingState] | None = None,
|
||||
limit: int = 100,
|
||||
offset: int = 0,
|
||||
sort_desc: bool = True,
|
||||
self, states: list[MeetingState] | None = None, limit: int = 100,
|
||||
offset: int = 0, sort_desc: bool = True,
|
||||
) -> tuple[Sequence[Meeting], int]:
|
||||
"""List meetings with optional filtering."""
|
||||
"""List meetings via in-memory store with optional state filtering."""
|
||||
return self._store.list_all(states, limit, offset, sort_desc)
|
||||
|
||||
async def count_by_state(self, state: MeetingState) -> int:
|
||||
@@ -86,59 +89,39 @@ class MemorySegmentRepository:
|
||||
return segment
|
||||
|
||||
async def add_batch(
|
||||
self,
|
||||
meeting_id: MeetingId,
|
||||
segments: Sequence[Segment],
|
||||
self, meeting_id: MeetingId, segments: Sequence[Segment],
|
||||
) -> Sequence[Segment]:
|
||||
"""Add multiple segments to a meeting in batch."""
|
||||
"""Add segments to meeting via in-memory store (no DB batching)."""
|
||||
for segment in segments:
|
||||
self._store.add_segment(str(meeting_id), segment)
|
||||
return segments
|
||||
|
||||
async def get_by_meeting(
|
||||
self,
|
||||
meeting_id: MeetingId,
|
||||
include_words: bool = True,
|
||||
self, meeting_id: MeetingId, include_words: bool = True,
|
||||
) -> Sequence[Segment]:
|
||||
"""Get all segments for a meeting."""
|
||||
"""Fetch segments from in-memory store (include_words ignored)."""
|
||||
return self._store.fetch_segments(str(meeting_id))
|
||||
|
||||
async def search_semantic(
|
||||
self,
|
||||
query_embedding: list[float],
|
||||
limit: int = 10,
|
||||
self, query_embedding: list[float], limit: int = 10,
|
||||
meeting_id: MeetingId | None = None,
|
||||
) -> Sequence[tuple[Segment, float]]:
|
||||
"""Semantic search not supported in memory mode."""
|
||||
"""Returns empty - semantic search requires database with pgvector."""
|
||||
return []
|
||||
|
||||
async def update_embedding(
|
||||
self,
|
||||
segment_db_id: int,
|
||||
embedding: list[float],
|
||||
self, segment_db_id: int, embedding: list[float],
|
||||
) -> None:
|
||||
"""Embeddings not supported in memory mode."""
|
||||
"""No-op: embeddings require database with pgvector extension."""
|
||||
|
||||
async def update_speaker(
|
||||
self,
|
||||
segment_db_id: int,
|
||||
speaker_id: str | None,
|
||||
speaker_confidence: float,
|
||||
self, segment_db_id: int, speaker_id: str | None, speaker_confidence: float,
|
||||
) -> None:
|
||||
"""Update speaker for segment - not applicable in memory mode.
|
||||
|
||||
In memory mode, segments are updated directly on the entity.
|
||||
This method exists for interface compatibility.
|
||||
"""
|
||||
"""No-op: in-memory segments update speaker directly on entity."""
|
||||
|
||||
async def compute_next_segment_id(self, meeting_id: MeetingId) -> int:
|
||||
"""Compute next available segment ID for a meeting.
|
||||
|
||||
Returns:
|
||||
Next sequential segment ID (0 if meeting has no segments).
|
||||
"""
|
||||
meeting_id_str = str(meeting_id)
|
||||
return self._store.compute_next_segment_id(meeting_id_str)
|
||||
"""Compute next available segment ID (0 if meeting has no segments)."""
|
||||
return self._store.compute_next_segment_id(str(meeting_id))
|
||||
|
||||
|
||||
class MemorySummaryRepository:
|
||||
@@ -170,32 +153,29 @@ class UnsupportedAnnotationRepository:
|
||||
|
||||
async def add(self, annotation: Annotation) -> Annotation:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Annotations require database persistence")
|
||||
raise NotImplementedError(_ERR_ANNOTATIONS_DB)
|
||||
|
||||
async def get(self, annotation_id: AnnotationId) -> Annotation | None:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Annotations require database persistence")
|
||||
raise NotImplementedError(_ERR_ANNOTATIONS_DB)
|
||||
|
||||
async def get_by_meeting(self, meeting_id: MeetingId) -> Sequence[Annotation]:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Annotations require database persistence")
|
||||
raise NotImplementedError(_ERR_ANNOTATIONS_DB)
|
||||
|
||||
async def get_by_time_range(
|
||||
self,
|
||||
meeting_id: MeetingId,
|
||||
start_time: float,
|
||||
end_time: float,
|
||||
self, meeting_id: MeetingId, start_time: float, end_time: float,
|
||||
) -> Sequence[Annotation]:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Annotations require database persistence")
|
||||
raise NotImplementedError(_ERR_ANNOTATIONS_DB)
|
||||
|
||||
async def update(self, annotation: Annotation) -> Annotation:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Annotations require database persistence")
|
||||
raise NotImplementedError(_ERR_ANNOTATIONS_DB)
|
||||
|
||||
async def delete(self, annotation_id: AnnotationId) -> bool:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Annotations require database persistence")
|
||||
raise NotImplementedError(_ERR_ANNOTATIONS_DB)
|
||||
|
||||
|
||||
class UnsupportedDiarizationJobRepository:
|
||||
@@ -206,32 +186,27 @@ class UnsupportedDiarizationJobRepository:
|
||||
|
||||
async def create(self, job: DiarizationJob) -> DiarizationJob:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def get(self, job_id: str) -> DiarizationJob | None:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def update_status(
|
||||
self,
|
||||
job_id: str,
|
||||
status: int,
|
||||
*,
|
||||
segments_updated: int | None = None,
|
||||
speaker_ids: list[str] | None = None,
|
||||
error_message: str | None = None,
|
||||
self, job_id: str, status: int, *, segments_updated: int | None = None,
|
||||
speaker_ids: list[str] | None = None, error_message: str | None = None,
|
||||
started_at: datetime | None = None,
|
||||
) -> bool:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def get_active_for_meeting(self, meeting_id: str) -> DiarizationJob | None:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def prune_completed(self, ttl_seconds: float) -> int:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def add_streaming_turns(
|
||||
self,
|
||||
@@ -239,19 +214,19 @@ class UnsupportedDiarizationJobRepository:
|
||||
turns: Sequence[StreamingTurn],
|
||||
) -> int:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def get_streaming_turns(self, meeting_id: str) -> list[StreamingTurn]:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def clear_streaming_turns(self, meeting_id: str) -> int:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
async def mark_running_as_failed(self, error_message: str = "Server restarted") -> int:
|
||||
async def mark_running_as_failed(self, error_message: str = ERR_SERVER_RESTARTED) -> int:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Diarization jobs require database persistence")
|
||||
raise NotImplementedError(_ERR_DIARIZATION_DB)
|
||||
|
||||
|
||||
class UnsupportedPreferencesRepository:
|
||||
@@ -262,15 +237,15 @@ class UnsupportedPreferencesRepository:
|
||||
|
||||
async def get(self, key: str) -> object | None:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Preferences require database persistence")
|
||||
raise NotImplementedError(_ERR_PREFERENCES_DB)
|
||||
|
||||
async def set(self, key: str, value: object) -> None:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Preferences require database persistence")
|
||||
raise NotImplementedError(_ERR_PREFERENCES_DB)
|
||||
|
||||
async def delete(self, key: str) -> bool:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("Preferences require database persistence")
|
||||
raise NotImplementedError(_ERR_PREFERENCES_DB)
|
||||
|
||||
|
||||
class UnsupportedEntityRepository:
|
||||
@@ -281,31 +256,31 @@ class UnsupportedEntityRepository:
|
||||
|
||||
async def save(self, entity: NamedEntity) -> NamedEntity:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
async def save_batch(self, entities: Sequence[NamedEntity]) -> Sequence[NamedEntity]:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
async def get(self, entity_id: UUID) -> NamedEntity | None:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
async def get_by_meeting(self, meeting_id: MeetingId) -> Sequence[NamedEntity]:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
async def delete_by_meeting(self, meeting_id: MeetingId) -> int:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
async def update_pinned(self, entity_id: UUID, is_pinned: bool) -> bool:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
async def exists_for_meeting(self, meeting_id: MeetingId) -> bool:
|
||||
"""Not supported in memory mode."""
|
||||
raise NotImplementedError("NER entities require database persistence")
|
||||
raise NotImplementedError(_ERR_NER_ENTITIES_DB)
|
||||
|
||||
|
||||
class InMemoryIntegrationRepository:
|
||||
@@ -325,9 +300,7 @@ class InMemoryIntegrationRepository:
|
||||
return self._integrations.get(integration_id)
|
||||
|
||||
async def get_by_provider(
|
||||
self,
|
||||
provider: str,
|
||||
integration_type: str | None = None,
|
||||
self, provider: str, integration_type: str | None = None,
|
||||
) -> Integration | None:
|
||||
"""Retrieve an integration by provider name."""
|
||||
for integration in self._integrations.values():
|
||||
@@ -377,3 +350,111 @@ class InMemoryIntegrationRepository:
|
||||
i for i in self._integrations.values()
|
||||
if i.type.value == integration_type
|
||||
]
|
||||
|
||||
|
||||
class InMemoryWebhookRepository:
|
||||
"""In-memory webhook repository for testing.
|
||||
|
||||
Provides a functional webhook repository backed by dictionaries
|
||||
for use in tests and memory mode.
|
||||
"""
|
||||
|
||||
def __init__(self) -> None:
|
||||
"""Initialize with empty storage."""
|
||||
self._configs: dict[UUID, WebhookConfig] = {}
|
||||
self._deliveries: dict[UUID, list[WebhookDelivery]] = {}
|
||||
|
||||
async def get_all_enabled(
|
||||
self, workspace_id: UUID | None = None,
|
||||
) -> Sequence[WebhookConfig]:
|
||||
"""Retrieve all enabled webhook configurations (in-memory)."""
|
||||
configs = [c for c in self._configs.values() if c.enabled]
|
||||
if workspace_id is not None:
|
||||
configs = [c for c in configs if c.workspace_id == workspace_id]
|
||||
return sorted(configs, key=lambda c: c.created_at, reverse=True)
|
||||
|
||||
async def get_all(
|
||||
self, workspace_id: UUID | None = None,
|
||||
) -> Sequence[WebhookConfig]:
|
||||
"""Retrieve all webhook configurations regardless of enabled status."""
|
||||
configs = list(self._configs.values())
|
||||
if workspace_id is not None:
|
||||
configs = [c for c in configs if c.workspace_id == workspace_id]
|
||||
return sorted(configs, key=lambda c: c.created_at, reverse=True)
|
||||
|
||||
async def get_by_id(self, webhook_id: UUID) -> WebhookConfig | None:
|
||||
"""Retrieve a webhook configuration by ID.
|
||||
|
||||
Args:
|
||||
webhook_id: Unique webhook identifier.
|
||||
|
||||
Returns:
|
||||
Webhook configuration or None if not found.
|
||||
"""
|
||||
return self._configs.get(webhook_id)
|
||||
|
||||
async def create(self, config: WebhookConfig) -> WebhookConfig:
|
||||
"""Persist a new webhook configuration.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration to create.
|
||||
|
||||
Returns:
|
||||
Created webhook.
|
||||
"""
|
||||
self._configs[config.id] = config
|
||||
return config
|
||||
|
||||
async def update(self, config: WebhookConfig) -> WebhookConfig:
|
||||
"""Update an existing webhook configuration.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration with updated values.
|
||||
|
||||
Returns:
|
||||
Updated webhook configuration.
|
||||
|
||||
Raises:
|
||||
ValueError: If webhook does not exist.
|
||||
"""
|
||||
if config.id not in self._configs:
|
||||
msg = f"Webhook {config.id} not found"
|
||||
raise ValueError(msg)
|
||||
self._configs[config.id] = config
|
||||
return config
|
||||
|
||||
async def delete(self, webhook_id: UUID) -> bool:
|
||||
"""Delete a webhook configuration.
|
||||
|
||||
Args:
|
||||
webhook_id: Unique webhook identifier.
|
||||
|
||||
Returns:
|
||||
True if deleted, False if not found.
|
||||
"""
|
||||
if webhook_id not in self._configs:
|
||||
return False
|
||||
del self._configs[webhook_id]
|
||||
self._deliveries.pop(webhook_id, None)
|
||||
return True
|
||||
|
||||
async def add_delivery(self, delivery: WebhookDelivery) -> WebhookDelivery:
|
||||
"""Record a webhook delivery attempt.
|
||||
|
||||
Args:
|
||||
delivery: Delivery record to persist.
|
||||
|
||||
Returns:
|
||||
Persisted delivery record.
|
||||
"""
|
||||
if delivery.webhook_id not in self._deliveries:
|
||||
self._deliveries[delivery.webhook_id] = []
|
||||
self._deliveries[delivery.webhook_id].insert(0, delivery)
|
||||
return delivery
|
||||
|
||||
async def get_deliveries(
|
||||
self, webhook_id: UUID, limit: int = 50,
|
||||
) -> Sequence[WebhookDelivery]:
|
||||
"""Retrieve delivery history for a webhook (in-memory)."""
|
||||
deliveries = self._deliveries.get(webhook_id, [])
|
||||
return deliveries[:limit]
|
||||
|
||||
@@ -18,10 +18,12 @@ from noteflow.domain.ports.repositories import (
|
||||
PreferencesRepository,
|
||||
SegmentRepository,
|
||||
SummaryRepository,
|
||||
WebhookRepository,
|
||||
)
|
||||
|
||||
from .repositories import (
|
||||
InMemoryIntegrationRepository,
|
||||
InMemoryWebhookRepository,
|
||||
MemoryMeetingRepository,
|
||||
MemorySegmentRepository,
|
||||
MemorySummaryRepository,
|
||||
@@ -66,6 +68,7 @@ class MemoryUnitOfWork:
|
||||
self._preferences = UnsupportedPreferencesRepository()
|
||||
self._entities = UnsupportedEntityRepository()
|
||||
self._integrations = InMemoryIntegrationRepository()
|
||||
self._webhooks = InMemoryWebhookRepository()
|
||||
|
||||
# Core repositories
|
||||
@property
|
||||
@@ -109,6 +112,11 @@ class MemoryUnitOfWork:
|
||||
"""Get integrations repository."""
|
||||
return self._integrations
|
||||
|
||||
@property
|
||||
def webhooks(self) -> WebhookRepository:
|
||||
"""Get webhooks repository for event notifications."""
|
||||
return self._webhooks
|
||||
|
||||
# Feature flags - limited in memory mode
|
||||
@property
|
||||
def supports_annotations(self) -> bool:
|
||||
@@ -135,6 +143,11 @@ class MemoryUnitOfWork:
|
||||
"""Integration persistence supported in memory mode."""
|
||||
return True
|
||||
|
||||
@property
|
||||
def supports_webhooks(self) -> bool:
|
||||
"""Webhook persistence supported in memory mode."""
|
||||
return True
|
||||
|
||||
async def __aenter__(self) -> Self:
|
||||
"""Enter the unit of work context.
|
||||
|
||||
|
||||
@@ -13,6 +13,7 @@ from .meeting_repo import SqlAlchemyMeetingRepository
|
||||
from .preferences_repo import SqlAlchemyPreferencesRepository
|
||||
from .segment_repo import SqlAlchemySegmentRepository
|
||||
from .summary_repo import SqlAlchemySummaryRepository
|
||||
from .webhook_repo import SqlAlchemyWebhookRepository
|
||||
|
||||
__all__ = [
|
||||
"JOB_STATUS_CANCELLED",
|
||||
@@ -25,5 +26,6 @@ __all__ = [
|
||||
"SqlAlchemyPreferencesRepository",
|
||||
"SqlAlchemySegmentRepository",
|
||||
"SqlAlchemySummaryRepository",
|
||||
"SqlAlchemyWebhookRepository",
|
||||
"StreamingTurn",
|
||||
]
|
||||
|
||||
@@ -8,6 +8,7 @@ from uuid import UUID
|
||||
|
||||
from sqlalchemy import delete, select, update
|
||||
|
||||
from noteflow.config.constants import ERR_SERVER_RESTARTED
|
||||
from noteflow.infrastructure.persistence.models import (
|
||||
DiarizationJobModel,
|
||||
StreamingDiarizationTurnModel,
|
||||
@@ -193,7 +194,7 @@ class SqlAlchemyDiarizationJobRepository(BaseRepository):
|
||||
model = await self._execute_scalar(stmt)
|
||||
return self._to_domain(model) if model else None
|
||||
|
||||
async def mark_running_as_failed(self, error_message: str = "Server restarted") -> int:
|
||||
async def mark_running_as_failed(self, error_message: str = ERR_SERVER_RESTARTED) -> int:
|
||||
"""Mark all QUEUED or RUNNING jobs as FAILED.
|
||||
|
||||
Used during crash recovery to mark orphaned jobs.
|
||||
|
||||
@@ -0,0 +1,186 @@
|
||||
"""SQLAlchemy repository for Webhook entities."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from collections.abc import Sequence
|
||||
from typing import TYPE_CHECKING
|
||||
from uuid import UUID
|
||||
|
||||
from sqlalchemy import select
|
||||
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookDelivery
|
||||
from noteflow.infrastructure.converters.webhook_converters import WebhookConverter
|
||||
from noteflow.infrastructure.persistence.models.integrations.webhook import (
|
||||
WebhookConfigModel,
|
||||
WebhookDeliveryModel,
|
||||
)
|
||||
from noteflow.infrastructure.persistence.repositories._base import BaseRepository
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
|
||||
|
||||
class SqlAlchemyWebhookRepository(BaseRepository):
|
||||
"""SQLAlchemy implementation of WebhookRepository.
|
||||
|
||||
Manages webhook configurations and delivery history for event notifications.
|
||||
"""
|
||||
|
||||
def __init__(self, session: AsyncSession) -> None:
|
||||
"""Initialize repository with database session.
|
||||
|
||||
Args:
|
||||
session: SQLAlchemy async session.
|
||||
"""
|
||||
super().__init__(session)
|
||||
|
||||
async def get_all_enabled(
|
||||
self,
|
||||
workspace_id: UUID | None = None,
|
||||
) -> Sequence[WebhookConfig]:
|
||||
"""Retrieve all enabled webhook configurations.
|
||||
|
||||
Args:
|
||||
workspace_id: Optional filter by workspace.
|
||||
|
||||
Returns:
|
||||
Sequence of enabled webhook configurations.
|
||||
"""
|
||||
stmt = select(WebhookConfigModel).where(WebhookConfigModel.enabled.is_(True))
|
||||
if workspace_id is not None:
|
||||
stmt = stmt.where(WebhookConfigModel.workspace_id == workspace_id)
|
||||
stmt = stmt.order_by(WebhookConfigModel.created_at.desc())
|
||||
|
||||
models = await self._execute_scalars(stmt)
|
||||
return [WebhookConverter.config_to_domain(m) for m in models]
|
||||
|
||||
async def get_all(
|
||||
self,
|
||||
workspace_id: UUID | None = None,
|
||||
) -> Sequence[WebhookConfig]:
|
||||
"""Retrieve all webhook configurations regardless of enabled status.
|
||||
|
||||
Args:
|
||||
workspace_id: Optional filter by workspace.
|
||||
|
||||
Returns:
|
||||
Sequence of all webhook configurations.
|
||||
"""
|
||||
stmt = select(WebhookConfigModel)
|
||||
if workspace_id is not None:
|
||||
stmt = stmt.where(WebhookConfigModel.workspace_id == workspace_id)
|
||||
stmt = stmt.order_by(WebhookConfigModel.created_at.desc())
|
||||
|
||||
models = await self._execute_scalars(stmt)
|
||||
return [WebhookConverter.config_to_domain(m) for m in models]
|
||||
|
||||
async def get_by_id(self, webhook_id: UUID) -> WebhookConfig | None:
|
||||
"""Retrieve a webhook configuration by ID.
|
||||
|
||||
Args:
|
||||
webhook_id: Unique webhook identifier.
|
||||
|
||||
Returns:
|
||||
Webhook configuration or None if not found.
|
||||
"""
|
||||
stmt = select(WebhookConfigModel).where(WebhookConfigModel.id == webhook_id)
|
||||
model = await self._execute_scalar(stmt)
|
||||
return WebhookConverter.config_to_domain(model) if model else None
|
||||
|
||||
async def create(self, config: WebhookConfig) -> WebhookConfig:
|
||||
"""Persist a new webhook configuration.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration to create.
|
||||
|
||||
Returns:
|
||||
Created webhook with any generated fields populated.
|
||||
"""
|
||||
kwargs = WebhookConverter.config_to_orm_kwargs(config)
|
||||
model = WebhookConfigModel(**kwargs)
|
||||
await self._add_and_flush(model)
|
||||
return WebhookConverter.config_to_domain(model)
|
||||
|
||||
async def update(self, config: WebhookConfig) -> WebhookConfig:
|
||||
"""Update an existing webhook configuration.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration with updated values.
|
||||
|
||||
Returns:
|
||||
Updated webhook configuration.
|
||||
|
||||
Raises:
|
||||
ValueError: If webhook does not exist.
|
||||
"""
|
||||
stmt = select(WebhookConfigModel).where(WebhookConfigModel.id == config.id)
|
||||
model = await self._execute_scalar(stmt)
|
||||
if not model:
|
||||
msg = f"Webhook {config.id} not found"
|
||||
raise ValueError(msg)
|
||||
|
||||
# Update fields
|
||||
model.name = config.name
|
||||
model.url = config.url
|
||||
model.events = [e.value for e in config.events]
|
||||
model.secret = config.secret
|
||||
model.enabled = config.enabled
|
||||
model.timeout_ms = config.timeout_ms
|
||||
model.max_retries = config.max_retries
|
||||
|
||||
await self._session.flush()
|
||||
return WebhookConverter.config_to_domain(model)
|
||||
|
||||
async def delete(self, webhook_id: UUID) -> bool:
|
||||
"""Delete a webhook configuration.
|
||||
|
||||
Args:
|
||||
webhook_id: Unique webhook identifier.
|
||||
|
||||
Returns:
|
||||
True if deleted, False if not found.
|
||||
"""
|
||||
stmt = select(WebhookConfigModel).where(WebhookConfigModel.id == webhook_id)
|
||||
model = await self._execute_scalar(stmt)
|
||||
if not model:
|
||||
return False
|
||||
|
||||
await self._delete_and_flush(model)
|
||||
return True
|
||||
|
||||
async def add_delivery(self, delivery: WebhookDelivery) -> WebhookDelivery:
|
||||
"""Record a webhook delivery attempt.
|
||||
|
||||
Args:
|
||||
delivery: Delivery record to persist.
|
||||
|
||||
Returns:
|
||||
Persisted delivery record.
|
||||
"""
|
||||
kwargs = WebhookConverter.delivery_to_orm_kwargs(delivery)
|
||||
model = WebhookDeliveryModel(**kwargs)
|
||||
await self._add_and_flush(model)
|
||||
return WebhookConverter.delivery_to_domain(model)
|
||||
|
||||
async def get_deliveries(
|
||||
self,
|
||||
webhook_id: UUID,
|
||||
limit: int = 50,
|
||||
) -> Sequence[WebhookDelivery]:
|
||||
"""Retrieve delivery history for a webhook.
|
||||
|
||||
Args:
|
||||
webhook_id: Webhook to get deliveries for.
|
||||
limit: Maximum number of records.
|
||||
|
||||
Returns:
|
||||
Sequence of delivery records, newest first.
|
||||
"""
|
||||
stmt = (
|
||||
select(WebhookDeliveryModel)
|
||||
.where(WebhookDeliveryModel.webhook_id == webhook_id)
|
||||
.order_by(WebhookDeliveryModel.delivered_at.desc())
|
||||
.limit(limit)
|
||||
)
|
||||
models = await self._execute_scalars(stmt)
|
||||
return [WebhookConverter.delivery_to_domain(m) for m in models]
|
||||
@@ -22,9 +22,39 @@ from .repositories import (
|
||||
SqlAlchemyPreferencesRepository,
|
||||
SqlAlchemySegmentRepository,
|
||||
SqlAlchemySummaryRepository,
|
||||
SqlAlchemyWebhookRepository,
|
||||
)
|
||||
|
||||
|
||||
def create_uow_from_settings(settings: Settings) -> SqlAlchemyUnitOfWork:
|
||||
"""Create a unit of work from application settings.
|
||||
|
||||
Builds an async engine and session factory using configured database
|
||||
settings (URL, pool size, echo), then returns a new unit of work
|
||||
instance bound to that factory.
|
||||
"""
|
||||
engine = create_async_engine(settings)
|
||||
session_factory = get_async_session_factory(engine)
|
||||
return SqlAlchemyUnitOfWork(session_factory)
|
||||
|
||||
|
||||
def create_uow_factory(settings: Settings) -> Callable[[], SqlAlchemyUnitOfWork]:
|
||||
"""Create a reusable factory that yields fresh UoW instances.
|
||||
|
||||
The factory reuses a shared async session factory (and engine) while
|
||||
returning a new `SqlAlchemyUnitOfWork` object each time. Useful when
|
||||
callers need independent UoW instances for sequential operations
|
||||
(e.g., retention cleanup) to avoid re-entrancy issues.
|
||||
"""
|
||||
engine = create_async_engine(settings)
|
||||
session_factory = get_async_session_factory(engine)
|
||||
|
||||
def _factory() -> SqlAlchemyUnitOfWork:
|
||||
return SqlAlchemyUnitOfWork(session_factory)
|
||||
|
||||
return _factory
|
||||
|
||||
|
||||
class SqlAlchemyUnitOfWork:
|
||||
"""SQLAlchemy implementation of Unit of Work.
|
||||
|
||||
@@ -39,11 +69,7 @@ class SqlAlchemyUnitOfWork:
|
||||
"""
|
||||
|
||||
def __init__(self, session_factory: async_sessionmaker[AsyncSession]) -> None:
|
||||
"""Initialize unit of work with session factory.
|
||||
|
||||
Args:
|
||||
session_factory: Factory for creating async sessions.
|
||||
"""
|
||||
"""Initialize unit of work with session factory."""
|
||||
self._session_factory = session_factory
|
||||
self._session: AsyncSession | None = None
|
||||
self._annotations_repo: SqlAlchemyAnnotationRepository | None = None
|
||||
@@ -54,39 +80,7 @@ class SqlAlchemyUnitOfWork:
|
||||
self._preferences_repo: SqlAlchemyPreferencesRepository | None = None
|
||||
self._segments_repo: SqlAlchemySegmentRepository | None = None
|
||||
self._summaries_repo: SqlAlchemySummaryRepository | None = None
|
||||
|
||||
# --- Constructors -------------------------------------------------
|
||||
|
||||
@classmethod
|
||||
def from_settings(cls, settings: Settings) -> SqlAlchemyUnitOfWork:
|
||||
"""Create a unit of work from application settings.
|
||||
|
||||
Builds an async engine and session factory using configured database
|
||||
settings (URL, pool size, echo), then returns a new unit of work
|
||||
instance bound to that factory.
|
||||
"""
|
||||
|
||||
engine = create_async_engine(settings)
|
||||
session_factory = get_async_session_factory(engine)
|
||||
return cls(session_factory)
|
||||
|
||||
@classmethod
|
||||
def factory_from_settings(cls, settings: Settings) -> Callable[[], SqlAlchemyUnitOfWork]:
|
||||
"""Create a reusable factory that yields fresh UoW instances.
|
||||
|
||||
The factory reuses a shared async session factory (and engine) while
|
||||
returning a new `SqlAlchemyUnitOfWork` object each time. Useful when
|
||||
callers need independent UoW instances for sequential operations
|
||||
(e.g., retention cleanup) to avoid re-entrancy issues.
|
||||
"""
|
||||
|
||||
engine = create_async_engine(settings)
|
||||
session_factory = get_async_session_factory(engine)
|
||||
|
||||
def _factory() -> SqlAlchemyUnitOfWork:
|
||||
return cls(session_factory)
|
||||
|
||||
return _factory
|
||||
self._webhooks_repo: SqlAlchemyWebhookRepository | None = None
|
||||
|
||||
@property
|
||||
def annotations(self) -> SqlAlchemyAnnotationRepository:
|
||||
@@ -144,6 +138,13 @@ class SqlAlchemyUnitOfWork:
|
||||
raise RuntimeError("UnitOfWork not in context")
|
||||
return self._summaries_repo
|
||||
|
||||
@property
|
||||
def webhooks(self) -> SqlAlchemyWebhookRepository:
|
||||
"""Get webhooks repository for event notifications."""
|
||||
if self._webhooks_repo is None:
|
||||
raise RuntimeError("UnitOfWork not in context")
|
||||
return self._webhooks_repo
|
||||
|
||||
# Feature flags - all True for database-backed implementation
|
||||
@property
|
||||
def supports_annotations(self) -> bool:
|
||||
@@ -170,6 +171,11 @@ class SqlAlchemyUnitOfWork:
|
||||
"""OAuth integration persistence is fully supported with database."""
|
||||
return True
|
||||
|
||||
@property
|
||||
def supports_webhooks(self) -> bool:
|
||||
"""Webhook persistence is fully supported with database."""
|
||||
return True
|
||||
|
||||
async def __aenter__(self) -> Self:
|
||||
"""Enter the unit of work context.
|
||||
|
||||
@@ -187,6 +193,7 @@ class SqlAlchemyUnitOfWork:
|
||||
self._preferences_repo = SqlAlchemyPreferencesRepository(self._session)
|
||||
self._segments_repo = SqlAlchemySegmentRepository(self._session)
|
||||
self._summaries_repo = SqlAlchemySummaryRepository(self._session)
|
||||
self._webhooks_repo = SqlAlchemyWebhookRepository(self._session)
|
||||
return self
|
||||
|
||||
async def __aexit__(
|
||||
@@ -220,6 +227,7 @@ class SqlAlchemyUnitOfWork:
|
||||
self._preferences_repo = None
|
||||
self._segments_repo = None
|
||||
self._summaries_repo = None
|
||||
self._webhooks_repo = None
|
||||
|
||||
async def commit(self) -> None:
|
||||
"""Commit the current transaction."""
|
||||
|
||||
@@ -14,6 +14,8 @@ from typing import Final
|
||||
|
||||
import keyring
|
||||
|
||||
from noteflow.config.constants import APP_DIR_NAME
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Constants
|
||||
@@ -21,7 +23,7 @@ KEY_SIZE: Final[int] = 32 # 256-bit key
|
||||
SERVICE_NAME: Final[str] = "noteflow"
|
||||
KEY_NAME: Final[str] = "master_key"
|
||||
ENV_VAR_NAME: Final[str] = "NOTEFLOW_MASTER_KEY"
|
||||
DEFAULT_KEY_FILE: Final[Path] = Path.home() / ".noteflow" / ".master_key"
|
||||
DEFAULT_KEY_FILE: Final[Path] = Path.home() / APP_DIR_NAME / ".master_key"
|
||||
|
||||
|
||||
def _decode_and_validate_key(encoded: str, source_name: str) -> bytes:
|
||||
|
||||
@@ -28,6 +28,27 @@ if TYPE_CHECKING:
|
||||
import openai
|
||||
|
||||
|
||||
def _get_llm_settings() -> tuple[str, str, float, float]:
|
||||
"""Get LLM settings with fallback defaults.
|
||||
|
||||
Returns:
|
||||
Tuple of (openai_model, anthropic_model, temperature, timeout_seconds).
|
||||
"""
|
||||
try:
|
||||
from noteflow.config.settings import get_settings
|
||||
|
||||
settings = get_settings()
|
||||
return (
|
||||
settings.llm_default_openai_model,
|
||||
settings.llm_default_anthropic_model,
|
||||
settings.llm_temperature,
|
||||
settings.llm_timeout_seconds,
|
||||
)
|
||||
except Exception:
|
||||
# Fallback for testing without full settings
|
||||
return ("gpt-4o-mini", "claude-3-haiku-20240307", 0.3, 60.0)
|
||||
|
||||
|
||||
class CloudBackend(Enum):
|
||||
"""Supported cloud LLM backends."""
|
||||
|
||||
@@ -46,21 +67,28 @@ class CloudSummarizer:
|
||||
backend: CloudBackend = CloudBackend.OPENAI,
|
||||
api_key: str | None = None,
|
||||
model: str | None = None,
|
||||
timeout_seconds: float = 60.0,
|
||||
timeout_seconds: float | None = None,
|
||||
base_url: str | None = None,
|
||||
temperature: float | None = None,
|
||||
) -> None:
|
||||
"""Initialize cloud summarizer.
|
||||
|
||||
Args:
|
||||
backend: Cloud provider backend (OpenAI or Anthropic).
|
||||
api_key: API key (defaults to env var if not provided).
|
||||
model: Model name (defaults per backend if not provided).
|
||||
timeout_seconds: Request timeout in seconds.
|
||||
model: Model name (defaults per backend from settings if not provided).
|
||||
timeout_seconds: Request timeout (uses settings if not provided).
|
||||
base_url: Optional base URL (OpenAI only; defaults to OpenAI API).
|
||||
temperature: LLM temperature (uses settings if not provided).
|
||||
"""
|
||||
# Load defaults from settings
|
||||
settings = _get_llm_settings()
|
||||
openai_model, anthropic_model, default_temp, default_timeout = settings
|
||||
|
||||
self._backend = backend
|
||||
self._api_key = api_key
|
||||
self._timeout = timeout_seconds
|
||||
self._timeout = timeout_seconds if timeout_seconds is not None else default_timeout
|
||||
self._temperature = temperature if temperature is not None else default_temp
|
||||
self._client: openai.OpenAI | anthropic.Anthropic | None = None
|
||||
# Only used for OpenAI
|
||||
self._openai_base_url = (
|
||||
@@ -71,11 +99,9 @@ class CloudSummarizer:
|
||||
else None
|
||||
)
|
||||
|
||||
# Set default models per backend
|
||||
# Set default models per backend from settings
|
||||
if model is None:
|
||||
self._model = (
|
||||
"gpt-4o-mini" if backend == CloudBackend.OPENAI else "claude-3-haiku-20240307"
|
||||
)
|
||||
self._model = openai_model if backend == CloudBackend.OPENAI else anthropic_model
|
||||
else:
|
||||
self._model = model
|
||||
|
||||
@@ -228,7 +254,7 @@ class CloudSummarizer:
|
||||
{"role": "system", "content": system_prompt},
|
||||
{"role": "user", "content": user_prompt},
|
||||
],
|
||||
temperature=0.3,
|
||||
temperature=self._temperature,
|
||||
response_format={"type": "json_object"},
|
||||
)
|
||||
except TimeoutError as e:
|
||||
|
||||
@@ -26,6 +26,26 @@ if TYPE_CHECKING:
|
||||
import ollama
|
||||
|
||||
|
||||
def _get_ollama_settings() -> tuple[str, float, float]:
|
||||
"""Get Ollama settings with fallback defaults.
|
||||
|
||||
Returns:
|
||||
Tuple of (host, timeout_seconds, temperature).
|
||||
"""
|
||||
try:
|
||||
from noteflow.config.settings import get_settings
|
||||
|
||||
settings = get_settings()
|
||||
return (
|
||||
settings.ollama_host,
|
||||
settings.ollama_timeout_seconds,
|
||||
settings.llm_temperature,
|
||||
)
|
||||
except Exception:
|
||||
# Fallback for testing without full settings
|
||||
return ("http://localhost:11434", 120.0, 0.3)
|
||||
|
||||
|
||||
class OllamaSummarizer:
|
||||
"""Ollama-based local LLM summarizer.
|
||||
|
||||
@@ -37,18 +57,24 @@ class OllamaSummarizer:
|
||||
self,
|
||||
model: str | None = None,
|
||||
host: str | None = None,
|
||||
timeout_seconds: float = 120.0,
|
||||
timeout_seconds: float | None = None,
|
||||
temperature: float | None = None,
|
||||
) -> None:
|
||||
"""Initialize Ollama summarizer.
|
||||
|
||||
Args:
|
||||
model: Ollama model name (e.g., 'llama3.2', 'mistral').
|
||||
host: Ollama server URL.
|
||||
timeout_seconds: Request timeout in seconds.
|
||||
host: Ollama server URL (uses settings if not provided).
|
||||
timeout_seconds: Request timeout (uses settings if not provided).
|
||||
temperature: LLM temperature (uses settings if not provided).
|
||||
"""
|
||||
# Load defaults from settings
|
||||
default_host, default_timeout, default_temp = _get_ollama_settings()
|
||||
|
||||
self._model = model or os.environ.get("OLLAMA_MODEL", "llama3.2")
|
||||
self._host = host or os.environ.get("OLLAMA_HOST", "http://localhost:11434")
|
||||
self._timeout = timeout_seconds
|
||||
self._host = host or os.environ.get("OLLAMA_HOST") or default_host
|
||||
self._timeout = timeout_seconds if timeout_seconds is not None else default_timeout
|
||||
self._temperature = temperature if temperature is not None else default_temp
|
||||
self._client: ollama.Client | None = None
|
||||
|
||||
def _get_client(self) -> ollama.Client:
|
||||
@@ -148,7 +174,7 @@ class OllamaSummarizer:
|
||||
{"role": "system", "content": effective_system_prompt},
|
||||
{"role": "user", "content": user_prompt},
|
||||
],
|
||||
options={"temperature": 0.3},
|
||||
options={"temperature": self._temperature},
|
||||
format="json",
|
||||
)
|
||||
except TimeoutError as e:
|
||||
|
||||
7
src/noteflow/infrastructure/webhooks/__init__.py
Normal file
7
src/noteflow/infrastructure/webhooks/__init__.py
Normal file
@@ -0,0 +1,7 @@
|
||||
"""Webhook infrastructure module for event delivery."""
|
||||
|
||||
from .executor import WebhookExecutor
|
||||
|
||||
__all__ = [
|
||||
"WebhookExecutor",
|
||||
]
|
||||
270
src/noteflow/infrastructure/webhooks/executor.py
Normal file
270
src/noteflow/infrastructure/webhooks/executor.py
Normal file
@@ -0,0 +1,270 @@
|
||||
"""Webhook execution infrastructure with retry logic and HMAC signing."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import hashlib
|
||||
import hmac
|
||||
import json
|
||||
import logging
|
||||
import time
|
||||
from typing import TYPE_CHECKING, Any
|
||||
from uuid import uuid4
|
||||
|
||||
import httpx
|
||||
|
||||
from noteflow.config.settings import get_settings
|
||||
from noteflow.domain.utils.time import utc_now
|
||||
from noteflow.domain.webhooks import WebhookDelivery, WebhookEventType
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from noteflow.domain.webhooks import WebhookConfig
|
||||
|
||||
_logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
def _get_webhook_settings() -> tuple[float, int, float, int]:
|
||||
"""Get webhook settings with fallback defaults for testing.
|
||||
|
||||
Returns:
|
||||
Tuple of (timeout_seconds, max_retries, backoff_base, max_response_length).
|
||||
"""
|
||||
try:
|
||||
settings = get_settings()
|
||||
return (
|
||||
settings.webhook_timeout_seconds,
|
||||
settings.webhook_max_retries,
|
||||
settings.webhook_backoff_base,
|
||||
settings.webhook_max_response_length,
|
||||
)
|
||||
except Exception:
|
||||
# Fallback for testing without full settings
|
||||
return (10.0, 3, 2.0, 500)
|
||||
|
||||
|
||||
class WebhookExecutor:
|
||||
"""Execute webhooks with retry logic and HMAC signing.
|
||||
|
||||
Implements exponential backoff for failed deliveries and
|
||||
supports optional HMAC-SHA256 signature verification.
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
max_retries: int | None = None,
|
||||
timeout_seconds: float | None = None,
|
||||
backoff_base: float | None = None,
|
||||
max_response_length: int | None = None,
|
||||
) -> None:
|
||||
"""Initialize webhook executor.
|
||||
|
||||
Args:
|
||||
max_retries: Maximum delivery attempts (uses settings if None).
|
||||
timeout_seconds: HTTP request timeout (uses settings if None).
|
||||
backoff_base: Exponential backoff multiplier (uses settings if None).
|
||||
max_response_length: Max response body to log (uses settings if None).
|
||||
"""
|
||||
defaults = _get_webhook_settings()
|
||||
self._timeout = timeout_seconds if timeout_seconds is not None else defaults[0]
|
||||
self._max_retries = max_retries if max_retries is not None else defaults[1]
|
||||
self._backoff_base = backoff_base if backoff_base is not None else defaults[2]
|
||||
self._max_response_length = max_response_length if max_response_length is not None else defaults[3]
|
||||
self._client: httpx.AsyncClient | None = None
|
||||
|
||||
async def _ensure_client(self) -> httpx.AsyncClient:
|
||||
"""Lazy-initialize HTTP client.
|
||||
|
||||
Returns:
|
||||
Configured async HTTP client.
|
||||
"""
|
||||
if self._client is None:
|
||||
self._client = httpx.AsyncClient(timeout=self._timeout)
|
||||
return self._client
|
||||
|
||||
async def deliver(
|
||||
self,
|
||||
config: WebhookConfig,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, Any],
|
||||
) -> WebhookDelivery:
|
||||
"""Deliver webhook with retries.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration.
|
||||
event_type: Type of event being delivered.
|
||||
payload: Event payload data.
|
||||
|
||||
Returns:
|
||||
Delivery record with status information.
|
||||
"""
|
||||
if not config.enabled:
|
||||
return self._create_delivery(
|
||||
config=config,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=None,
|
||||
error_message="Webhook disabled",
|
||||
attempt_count=0,
|
||||
duration_ms=None,
|
||||
)
|
||||
|
||||
if not config.subscribes_to(event_type):
|
||||
return self._create_delivery(
|
||||
config=config,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=None,
|
||||
error_message=f"Event {event_type.value} not subscribed",
|
||||
attempt_count=0,
|
||||
duration_ms=None,
|
||||
)
|
||||
|
||||
headers = self._build_headers(config, event_type, payload)
|
||||
client = await self._ensure_client()
|
||||
|
||||
max_retries = min(config.max_retries, self._max_retries)
|
||||
last_error: str | None = None
|
||||
attempt = 0
|
||||
|
||||
for attempt in range(1, max_retries + 1):
|
||||
start_time = time.monotonic()
|
||||
try:
|
||||
_logger.debug(
|
||||
"Webhook delivery attempt %d/%d to %s",
|
||||
attempt,
|
||||
max_retries,
|
||||
config.url,
|
||||
)
|
||||
|
||||
response = await client.post(
|
||||
config.url,
|
||||
json=payload,
|
||||
headers=headers,
|
||||
timeout=config.timeout_ms / 1000.0,
|
||||
)
|
||||
|
||||
duration_ms = int((time.monotonic() - start_time) * 1000)
|
||||
response_body = response.text[: self._max_response_length]
|
||||
|
||||
return self._create_delivery(
|
||||
config=config,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=response.status_code,
|
||||
response_body=response_body if not response.is_success else None,
|
||||
error_message=None if response.is_success else response_body,
|
||||
attempt_count=attempt,
|
||||
duration_ms=duration_ms,
|
||||
)
|
||||
|
||||
except httpx.TimeoutException:
|
||||
last_error = "Request timed out"
|
||||
_logger.warning(
|
||||
"Webhook timeout (attempt %d/%d): %s",
|
||||
attempt,
|
||||
max_retries,
|
||||
config.url,
|
||||
)
|
||||
|
||||
except httpx.RequestError as e:
|
||||
last_error = str(e)
|
||||
_logger.warning(
|
||||
"Webhook request error (attempt %d/%d): %s - %s",
|
||||
attempt,
|
||||
max_retries,
|
||||
config.url,
|
||||
e,
|
||||
)
|
||||
|
||||
# Exponential backoff before retry
|
||||
if attempt < max_retries:
|
||||
delay = self._backoff_base ** (attempt - 1)
|
||||
await asyncio.sleep(delay)
|
||||
|
||||
return self._create_delivery(
|
||||
config=config,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=None,
|
||||
error_message=f"Max retries exceeded: {last_error}",
|
||||
attempt_count=attempt,
|
||||
duration_ms=None,
|
||||
)
|
||||
|
||||
def _build_headers(
|
||||
self,
|
||||
config: WebhookConfig,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, Any],
|
||||
) -> dict[str, str]:
|
||||
"""Build HTTP headers for webhook request.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration.
|
||||
event_type: Type of event.
|
||||
payload: Event payload.
|
||||
|
||||
Returns:
|
||||
Headers dictionary including signature if secret configured.
|
||||
"""
|
||||
headers = {
|
||||
"Content-Type": "application/json",
|
||||
"X-NoteFlow-Event": event_type.value,
|
||||
"X-NoteFlow-Delivery": str(uuid4()),
|
||||
}
|
||||
|
||||
if config.secret:
|
||||
body = json.dumps(payload, separators=(",", ":"))
|
||||
signature = hmac.new(
|
||||
config.secret.encode(),
|
||||
body.encode(),
|
||||
hashlib.sha256,
|
||||
).hexdigest()
|
||||
headers["X-NoteFlow-Signature"] = f"sha256={signature}"
|
||||
|
||||
return headers
|
||||
|
||||
def _create_delivery(
|
||||
self,
|
||||
config: WebhookConfig,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, Any],
|
||||
status_code: int | None,
|
||||
error_message: str | None,
|
||||
attempt_count: int,
|
||||
duration_ms: int | None,
|
||||
response_body: str | None = None,
|
||||
) -> WebhookDelivery:
|
||||
"""Create a delivery record.
|
||||
|
||||
Args:
|
||||
config: Webhook configuration.
|
||||
event_type: Type of event.
|
||||
payload: Event payload.
|
||||
status_code: HTTP response status.
|
||||
error_message: Error description.
|
||||
attempt_count: Number of attempts.
|
||||
duration_ms: Request duration.
|
||||
response_body: Response body (for errors).
|
||||
|
||||
Returns:
|
||||
WebhookDelivery record.
|
||||
"""
|
||||
return WebhookDelivery(
|
||||
id=uuid4(),
|
||||
webhook_id=config.id,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=status_code,
|
||||
response_body=response_body,
|
||||
error_message=error_message,
|
||||
attempt_count=attempt_count,
|
||||
duration_ms=duration_ms,
|
||||
delivered_at=utc_now(),
|
||||
)
|
||||
|
||||
async def close(self) -> None:
|
||||
"""Close HTTP client and release resources."""
|
||||
if self._client is not None:
|
||||
await self._client.aclose()
|
||||
self._client = None
|
||||
@@ -17,19 +17,6 @@ if TYPE_CHECKING:
|
||||
from noteflow.config.settings import CalendarSettings
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def calendar_settings() -> CalendarSettings:
|
||||
"""Create test calendar settings."""
|
||||
from noteflow.config.settings import CalendarSettings
|
||||
|
||||
return CalendarSettings(
|
||||
google_client_id="test-google-client-id",
|
||||
google_client_secret="test-google-client-secret",
|
||||
outlook_client_id="test-outlook-client-id",
|
||||
outlook_client_secret="test-outlook-client-secret",
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_oauth_manager() -> MagicMock:
|
||||
"""Create mock OAuth manager."""
|
||||
@@ -87,18 +74,13 @@ def mock_outlook_adapter() -> MagicMock:
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_uow() -> MagicMock:
|
||||
"""Create mock unit of work."""
|
||||
uow = MagicMock()
|
||||
uow.__aenter__ = AsyncMock(return_value=uow)
|
||||
uow.__aexit__ = AsyncMock(return_value=None)
|
||||
uow.integrations = MagicMock()
|
||||
uow.integrations.get_by_type_and_provider = AsyncMock(return_value=None)
|
||||
uow.integrations.add = AsyncMock()
|
||||
uow.integrations.get_secrets = AsyncMock(return_value=None)
|
||||
uow.integrations.set_secrets = AsyncMock()
|
||||
uow.commit = AsyncMock()
|
||||
return uow
|
||||
def calendar_mock_uow(mock_uow: MagicMock) -> MagicMock:
|
||||
"""Configure mock_uow with calendar service specific integrations behavior."""
|
||||
mock_uow.integrations.get_by_type_and_provider = AsyncMock(return_value=None)
|
||||
mock_uow.integrations.add = AsyncMock()
|
||||
mock_uow.integrations.get_secrets = AsyncMock(return_value=None)
|
||||
mock_uow.integrations.set_secrets = AsyncMock()
|
||||
return mock_uow
|
||||
|
||||
|
||||
class TestCalendarServiceInitiateOAuth:
|
||||
@@ -111,13 +93,13 @@ class TestCalendarServiceInitiateOAuth:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""initiate_oauth should return auth URL and state."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -141,17 +123,17 @@ class TestCalendarServiceCompleteOAuth:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""complete_oauth should store tokens in integration secrets."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
mock_uow.integrations.create = AsyncMock()
|
||||
mock_uow.integrations.update = AsyncMock()
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
calendar_mock_uow.integrations.create = AsyncMock()
|
||||
calendar_mock_uow.integrations.update = AsyncMock()
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -162,8 +144,8 @@ class TestCalendarServiceCompleteOAuth:
|
||||
|
||||
assert result is True
|
||||
mock_oauth_manager.complete_auth.assert_called_once()
|
||||
mock_uow.integrations.set_secrets.assert_called_once()
|
||||
mock_uow.commit.assert_called()
|
||||
calendar_mock_uow.integrations.set_secrets.assert_called_once()
|
||||
calendar_mock_uow.commit.assert_called()
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_complete_oauth_creates_integration_if_not_exists(
|
||||
@@ -172,17 +154,17 @@ class TestCalendarServiceCompleteOAuth:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""complete_oauth should create new integration if none exists."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
mock_uow.integrations.create = AsyncMock()
|
||||
mock_uow.integrations.update = AsyncMock()
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
calendar_mock_uow.integrations.create = AsyncMock()
|
||||
calendar_mock_uow.integrations.update = AsyncMock()
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -191,7 +173,7 @@ class TestCalendarServiceCompleteOAuth:
|
||||
|
||||
await service.complete_oauth("google", "auth-code", "state-123")
|
||||
|
||||
mock_uow.integrations.create.assert_called_once()
|
||||
calendar_mock_uow.integrations.create.assert_called_once()
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_complete_oauth_updates_existing_integration(
|
||||
@@ -200,7 +182,7 @@ class TestCalendarServiceCompleteOAuth:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""complete_oauth should update existing integration."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
@@ -211,12 +193,12 @@ class TestCalendarServiceCompleteOAuth:
|
||||
integration_type=IntegrationType.CALENDAR,
|
||||
config={"provider": "google"},
|
||||
)
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=existing_integration)
|
||||
mock_uow.integrations.create = AsyncMock()
|
||||
mock_uow.integrations.update = AsyncMock()
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=existing_integration)
|
||||
calendar_mock_uow.integrations.create = AsyncMock()
|
||||
calendar_mock_uow.integrations.update = AsyncMock()
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -225,7 +207,7 @@ class TestCalendarServiceCompleteOAuth:
|
||||
|
||||
await service.complete_oauth("google", "auth-code", "state-123")
|
||||
|
||||
mock_uow.integrations.create.assert_not_called()
|
||||
calendar_mock_uow.integrations.create.assert_not_called()
|
||||
assert existing_integration.status == IntegrationStatus.CONNECTED
|
||||
|
||||
|
||||
@@ -239,7 +221,7 @@ class TestCalendarServiceGetConnectionStatus:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""get_connection_status should return connection info for connected provider."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
@@ -251,8 +233,8 @@ class TestCalendarServiceGetConnectionStatus:
|
||||
config={"provider": "google"},
|
||||
)
|
||||
integration.connect(provider_email="user@gmail.com")
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
mock_uow.integrations.get_secrets = AsyncMock(return_value={
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
calendar_mock_uow.integrations.get_secrets = AsyncMock(return_value={
|
||||
"access_token": "token",
|
||||
"refresh_token": "refresh",
|
||||
"token_type": "Bearer",
|
||||
@@ -261,7 +243,7 @@ class TestCalendarServiceGetConnectionStatus:
|
||||
})
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -280,15 +262,15 @@ class TestCalendarServiceGetConnectionStatus:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""get_connection_status should return disconnected when no integration."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -310,7 +292,7 @@ class TestCalendarServiceDisconnect:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""disconnect should revoke tokens and delete integration."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
@@ -322,12 +304,12 @@ class TestCalendarServiceDisconnect:
|
||||
config={"provider": "google"},
|
||||
)
|
||||
integration.connect(provider_email="user@gmail.com")
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
mock_uow.integrations.get_secrets = AsyncMock(return_value={"access_token": "token"})
|
||||
mock_uow.integrations.delete = AsyncMock()
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
calendar_mock_uow.integrations.get_secrets = AsyncMock(return_value={"access_token": "token"})
|
||||
calendar_mock_uow.integrations.delete = AsyncMock()
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -338,8 +320,8 @@ class TestCalendarServiceDisconnect:
|
||||
|
||||
assert result is True
|
||||
mock_oauth_manager.revoke_tokens.assert_called_once()
|
||||
mock_uow.integrations.delete.assert_called_once()
|
||||
mock_uow.commit.assert_called()
|
||||
calendar_mock_uow.integrations.delete.assert_called_once()
|
||||
calendar_mock_uow.commit.assert_called()
|
||||
|
||||
|
||||
class TestCalendarServiceListEvents:
|
||||
@@ -352,7 +334,7 @@ class TestCalendarServiceListEvents:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""list_calendar_events should fetch events from connected provider."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
@@ -364,18 +346,18 @@ class TestCalendarServiceListEvents:
|
||||
config={"provider": "google"},
|
||||
)
|
||||
integration.connect(provider_email="user@gmail.com")
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
mock_uow.integrations.get_secrets = AsyncMock(return_value={
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
calendar_mock_uow.integrations.get_secrets = AsyncMock(return_value={
|
||||
"access_token": "token",
|
||||
"refresh_token": "refresh",
|
||||
"token_type": "Bearer",
|
||||
"expires_at": (datetime.now(UTC) + timedelta(hours=1)).isoformat(),
|
||||
"scope": "calendar",
|
||||
})
|
||||
mock_uow.integrations.update = AsyncMock()
|
||||
calendar_mock_uow.integrations.update = AsyncMock()
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -395,7 +377,7 @@ class TestCalendarServiceListEvents:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""list_calendar_events should refresh expired token before fetching."""
|
||||
from noteflow.application.services.calendar_service import CalendarService
|
||||
@@ -407,18 +389,18 @@ class TestCalendarServiceListEvents:
|
||||
config={"provider": "google"},
|
||||
)
|
||||
integration.connect(provider_email="user@gmail.com")
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
mock_uow.integrations.get_secrets = AsyncMock(return_value={
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=integration)
|
||||
calendar_mock_uow.integrations.get_secrets = AsyncMock(return_value={
|
||||
"access_token": "expired-token",
|
||||
"refresh_token": "refresh-token",
|
||||
"token_type": "Bearer",
|
||||
"expires_at": (datetime.now(UTC) - timedelta(hours=1)).isoformat(),
|
||||
"scope": "calendar",
|
||||
})
|
||||
mock_uow.integrations.update = AsyncMock()
|
||||
calendar_mock_uow.integrations.update = AsyncMock()
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
@@ -428,7 +410,7 @@ class TestCalendarServiceListEvents:
|
||||
await service.list_calendar_events(provider="google")
|
||||
|
||||
mock_oauth_manager.refresh_tokens.assert_called_once()
|
||||
mock_uow.integrations.set_secrets.assert_called()
|
||||
calendar_mock_uow.integrations.set_secrets.assert_called()
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_list_events_raises_when_not_connected(
|
||||
@@ -437,15 +419,15 @@ class TestCalendarServiceListEvents:
|
||||
mock_oauth_manager: MagicMock,
|
||||
mock_google_adapter: MagicMock,
|
||||
mock_outlook_adapter: MagicMock,
|
||||
mock_uow: MagicMock,
|
||||
calendar_mock_uow: MagicMock,
|
||||
) -> None:
|
||||
"""list_calendar_events should raise error when provider not connected."""
|
||||
from noteflow.application.services.calendar_service import CalendarService, CalendarServiceError
|
||||
|
||||
mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
calendar_mock_uow.integrations.get_by_provider = AsyncMock(return_value=None)
|
||||
|
||||
service = CalendarService(
|
||||
uow_factory=lambda: mock_uow,
|
||||
uow_factory=lambda: calendar_mock_uow,
|
||||
settings=calendar_settings,
|
||||
oauth_manager=mock_oauth_manager,
|
||||
google_adapter=mock_google_adapter,
|
||||
|
||||
383
tests/application/test_webhook_service.py
Normal file
383
tests/application/test_webhook_service.py
Normal file
@@ -0,0 +1,383 @@
|
||||
"""Unit tests for WebhookService."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import Any
|
||||
from unittest.mock import AsyncMock, MagicMock
|
||||
from uuid import uuid4
|
||||
|
||||
import pytest
|
||||
|
||||
from noteflow.application.services.webhook_service import WebhookService
|
||||
from noteflow.domain.entities import Meeting, Summary
|
||||
from noteflow.domain.utils.time import utc_now
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookDelivery, WebhookEventType
|
||||
from noteflow.infrastructure.webhooks import WebhookExecutor
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def captured_payloads() -> list[dict[str, Any]]:
|
||||
"""Store payloads passed to executor for verification."""
|
||||
return []
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_executor(captured_payloads: list[dict[str, Any]]) -> MagicMock:
|
||||
"""Create a mock executor that captures delivered payloads."""
|
||||
executor = MagicMock(spec=WebhookExecutor)
|
||||
|
||||
async def capture_delivery(
|
||||
config: WebhookConfig,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, Any],
|
||||
) -> WebhookDelivery:
|
||||
captured_payloads.append(payload)
|
||||
return WebhookDelivery(
|
||||
id=uuid4(),
|
||||
webhook_id=config.id,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=200,
|
||||
response_body=None,
|
||||
error_message=None,
|
||||
attempt_count=1,
|
||||
duration_ms=100,
|
||||
delivered_at=utc_now(),
|
||||
)
|
||||
|
||||
executor.deliver = AsyncMock(side_effect=capture_delivery)
|
||||
executor.close = AsyncMock()
|
||||
return executor
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def webhook_service(mock_executor: MagicMock) -> WebhookService:
|
||||
"""Create a WebhookService with mock executor."""
|
||||
return WebhookService(executor=mock_executor)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def completed_meeting() -> Meeting:
|
||||
"""Create a meeting in completed state with segments."""
|
||||
meeting = Meeting.create(title="Q4 Planning Session")
|
||||
meeting.start_recording()
|
||||
# Add some segments to the meeting
|
||||
from noteflow.domain.entities import Segment
|
||||
|
||||
for i in range(3):
|
||||
segment = Segment(
|
||||
segment_id=i,
|
||||
text=f"Segment {i} content",
|
||||
start_time=float(i * 10),
|
||||
end_time=float(i * 10 + 9),
|
||||
meeting_id=meeting.id,
|
||||
)
|
||||
meeting.add_segment(segment)
|
||||
meeting.begin_stopping()
|
||||
meeting.stop_recording()
|
||||
return meeting
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def meeting_with_summary(completed_meeting: Meeting) -> Meeting:
|
||||
"""Create a meeting with an attached summary."""
|
||||
from noteflow.domain.entities.summary import ActionItem, KeyPoint
|
||||
|
||||
summary = Summary(
|
||||
meeting_id=completed_meeting.id,
|
||||
executive_summary="This meeting covered Q4 planning topics.",
|
||||
key_points=[
|
||||
KeyPoint(text="Budget allocation"),
|
||||
KeyPoint(text="Timeline review"),
|
||||
],
|
||||
action_items=[
|
||||
ActionItem(text="Send budget proposal"),
|
||||
ActionItem(text="Schedule follow-up"),
|
||||
],
|
||||
)
|
||||
completed_meeting.summary = summary
|
||||
return completed_meeting
|
||||
|
||||
|
||||
class TestWebhookRegistration:
|
||||
"""Test webhook configuration management."""
|
||||
|
||||
def test_registered_webhooks_are_retrievable(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config: WebhookConfig,
|
||||
) -> None:
|
||||
"""Registered webhooks appear in the webhook list."""
|
||||
webhook_service.register_webhook(webhook_config)
|
||||
|
||||
webhooks = webhook_service.get_webhooks()
|
||||
|
||||
assert len(webhooks) == 1
|
||||
assert webhooks[0].url == "https://example.com/webhook"
|
||||
assert webhooks[0].name == "Test Webhook"
|
||||
|
||||
def test_multiple_webhooks_can_be_registered(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
) -> None:
|
||||
"""Multiple webhooks can be registered simultaneously."""
|
||||
webhook1 = WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://first.example.com/hook",
|
||||
events=[WebhookEventType.MEETING_COMPLETED],
|
||||
name="First Webhook",
|
||||
)
|
||||
webhook2 = WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://second.example.com/hook",
|
||||
events=[WebhookEventType.SUMMARY_GENERATED],
|
||||
name="Second Webhook",
|
||||
)
|
||||
|
||||
webhook_service.register_webhook(webhook1)
|
||||
webhook_service.register_webhook(webhook2)
|
||||
|
||||
webhooks = webhook_service.get_webhooks()
|
||||
urls = {w.url for w in webhooks}
|
||||
|
||||
assert len(webhooks) == 2
|
||||
assert "https://first.example.com/hook" in urls
|
||||
assert "https://second.example.com/hook" in urls
|
||||
|
||||
def test_unregistered_webhook_no_longer_appears(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config: WebhookConfig,
|
||||
) -> None:
|
||||
"""Unregistered webhooks are removed from the list."""
|
||||
webhook_service.register_webhook(webhook_config)
|
||||
webhook_service.unregister_webhook(str(webhook_config.id))
|
||||
|
||||
webhooks = webhook_service.get_webhooks()
|
||||
|
||||
assert len(webhooks) == 0
|
||||
|
||||
def test_unregistering_unknown_id_returns_false(
|
||||
self, webhook_service: WebhookService
|
||||
) -> None:
|
||||
"""Unregistering a non-existent webhook returns False."""
|
||||
result = webhook_service.unregister_webhook(str(uuid4()))
|
||||
|
||||
assert result is False
|
||||
|
||||
|
||||
class TestMeetingCompletedPayload:
|
||||
"""Test meeting.completed webhook payload content."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_payload_contains_meeting_details(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config: WebhookConfig,
|
||||
completed_meeting: Meeting,
|
||||
captured_payloads: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""Payload includes meeting ID, title, duration, and segment count."""
|
||||
webhook_service.register_webhook(webhook_config)
|
||||
|
||||
await webhook_service.trigger_meeting_completed(completed_meeting)
|
||||
|
||||
assert len(captured_payloads) == 1
|
||||
payload = captured_payloads[0]
|
||||
|
||||
assert payload["event"] == "meeting.completed"
|
||||
assert payload["meeting_id"] == str(completed_meeting.id)
|
||||
assert payload["title"] == "Q4 Planning Session"
|
||||
assert payload["segment_count"] == 3
|
||||
assert "timestamp" in payload
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_payload_indicates_summary_presence(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config: WebhookConfig,
|
||||
meeting_with_summary: Meeting,
|
||||
captured_payloads: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""Payload correctly indicates whether meeting has a summary."""
|
||||
webhook_service.register_webhook(webhook_config)
|
||||
|
||||
await webhook_service.trigger_meeting_completed(meeting_with_summary)
|
||||
|
||||
payload = captured_payloads[0]
|
||||
assert payload["has_summary"] is True
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_no_delivery_when_no_webhooks_registered(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
completed_meeting: Meeting,
|
||||
captured_payloads: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""No payloads are delivered when no webhooks are registered."""
|
||||
deliveries = await webhook_service.trigger_meeting_completed(completed_meeting)
|
||||
|
||||
assert deliveries == []
|
||||
assert len(captured_payloads) == 0
|
||||
|
||||
|
||||
class TestSummaryGeneratedPayload:
|
||||
"""Test summary.generated webhook payload content."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_payload_contains_summary_details(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config_all_events: WebhookConfig,
|
||||
meeting_with_summary: Meeting,
|
||||
captured_payloads: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""Payload includes summary content and item counts."""
|
||||
webhook_service.register_webhook(webhook_config_all_events)
|
||||
|
||||
await webhook_service.trigger_summary_generated(meeting_with_summary)
|
||||
|
||||
assert len(captured_payloads) == 1
|
||||
payload = captured_payloads[0]
|
||||
|
||||
assert payload["event"] == "summary.generated"
|
||||
assert payload["meeting_id"] == str(meeting_with_summary.id)
|
||||
assert payload["title"] == "Q4 Planning Session"
|
||||
assert payload["executive_summary"] == "This meeting covered Q4 planning topics."
|
||||
assert payload["key_points_count"] == 2
|
||||
assert payload["action_items_count"] == 2
|
||||
|
||||
|
||||
class TestRecordingPayloads:
|
||||
"""Test recording.started and recording.stopped webhook payloads."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_recording_started_payload(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config_all_events: WebhookConfig,
|
||||
captured_payloads: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""Recording started payload contains meeting ID and title."""
|
||||
webhook_service.register_webhook(webhook_config_all_events)
|
||||
|
||||
await webhook_service.trigger_recording_started(
|
||||
meeting_id="meeting-abc-123",
|
||||
title="Weekly Standup",
|
||||
)
|
||||
|
||||
assert len(captured_payloads) == 1
|
||||
payload = captured_payloads[0]
|
||||
|
||||
assert payload["event"] == "recording.started"
|
||||
assert payload["meeting_id"] == "meeting-abc-123"
|
||||
assert payload["title"] == "Weekly Standup"
|
||||
assert "duration_seconds" not in payload # Not present for started event
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_recording_stopped_payload_includes_duration(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
webhook_config_all_events: WebhookConfig,
|
||||
captured_payloads: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""Recording stopped payload includes duration."""
|
||||
webhook_service.register_webhook(webhook_config_all_events)
|
||||
|
||||
await webhook_service.trigger_recording_stopped(
|
||||
meeting_id="meeting-abc-123",
|
||||
title="Weekly Standup",
|
||||
duration_seconds=1847.5,
|
||||
)
|
||||
|
||||
payload = captured_payloads[0]
|
||||
|
||||
assert payload["event"] == "recording.stopped"
|
||||
assert payload["duration_seconds"] == 1847.5
|
||||
|
||||
|
||||
class TestMultipleWebhookDelivery:
|
||||
"""Test delivery to multiple registered webhooks."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_event_delivered_to_all_subscribed_webhooks(
|
||||
self, webhook_service: WebhookService, completed_meeting: Meeting
|
||||
) -> None:
|
||||
"""Event is delivered to all webhooks subscribed to that event type."""
|
||||
for url in ["https://first.example.com/hook", "https://second.example.com/hook"]:
|
||||
webhook_service.register_webhook(
|
||||
WebhookConfig.create(
|
||||
workspace_id=uuid4(), url=url, events=[WebhookEventType.MEETING_COMPLETED]
|
||||
)
|
||||
)
|
||||
|
||||
deliveries = await webhook_service.trigger_meeting_completed(completed_meeting)
|
||||
|
||||
assert len(deliveries) == 2
|
||||
|
||||
|
||||
class TestErrorResilience:
|
||||
"""Test that webhook failures don't break the system."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_executor_exception_does_not_propagate(
|
||||
self,
|
||||
webhook_service: WebhookService,
|
||||
mock_executor: MagicMock,
|
||||
webhook_config: WebhookConfig,
|
||||
completed_meeting: Meeting,
|
||||
) -> None:
|
||||
"""Executor exceptions are caught and don't crash the trigger call."""
|
||||
webhook_service.register_webhook(webhook_config)
|
||||
mock_executor.deliver.side_effect = RuntimeError("Network unreachable")
|
||||
|
||||
# Should complete without raising
|
||||
deliveries = await webhook_service.trigger_meeting_completed(completed_meeting)
|
||||
|
||||
# Empty because exception prevented delivery record creation
|
||||
assert deliveries == []
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_one_failing_webhook_does_not_block_others(
|
||||
self, webhook_service: WebhookService, mock_executor: MagicMock, completed_meeting: Meeting
|
||||
) -> None:
|
||||
"""If one webhook fails, others still receive delivery."""
|
||||
for url in ["https://failing.example.com/hook", "https://working.example.com/hook"]:
|
||||
webhook_service.register_webhook(
|
||||
WebhookConfig.create(
|
||||
workspace_id=uuid4(), url=url, events=[WebhookEventType.MEETING_COMPLETED]
|
||||
)
|
||||
)
|
||||
|
||||
call_count = {"n": 0}
|
||||
|
||||
async def fail_first_then_succeed(
|
||||
config: WebhookConfig, event_type: WebhookEventType, payload: dict[str, Any]
|
||||
) -> WebhookDelivery:
|
||||
call_count["n"] += 1
|
||||
if call_count["n"] == 1:
|
||||
raise RuntimeError("First webhook failed")
|
||||
return WebhookDelivery(
|
||||
id=uuid4(), webhook_id=config.id, event_type=event_type, payload=payload,
|
||||
status_code=200, response_body=None, error_message=None,
|
||||
attempt_count=1, duration_ms=50, delivered_at=utc_now(),
|
||||
)
|
||||
|
||||
mock_executor.deliver = AsyncMock(side_effect=fail_first_then_succeed)
|
||||
deliveries = await webhook_service.trigger_meeting_completed(completed_meeting)
|
||||
|
||||
assert len(deliveries) == 1
|
||||
assert deliveries[0].status_code == 200
|
||||
|
||||
|
||||
class TestServiceLifecycle:
|
||||
"""Test service cleanup behavior."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_close_releases_executor_resources(
|
||||
self, webhook_service: WebhookService, mock_executor: MagicMock
|
||||
) -> None:
|
||||
"""Closing service releases underlying executor resources."""
|
||||
await webhook_service.close()
|
||||
|
||||
mock_executor.close.assert_called_once()
|
||||
@@ -7,17 +7,35 @@ override with more specific monkeypatches when needed.
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import os
|
||||
import sys
|
||||
import types
|
||||
from pathlib import Path
|
||||
from types import SimpleNamespace
|
||||
from unittest.mock import AsyncMock, MagicMock
|
||||
from uuid import uuid4
|
||||
|
||||
import pytest
|
||||
|
||||
from noteflow.config.settings import CalendarSettings
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookEventType
|
||||
from noteflow.infrastructure.security.crypto import AesGcmCryptoBox
|
||||
from noteflow.infrastructure.security.keystore import InMemoryKeyStore
|
||||
|
||||
# ============================================================================
|
||||
# Platform-specific library path setup (run before pytest collection)
|
||||
# ============================================================================
|
||||
|
||||
# macOS Homebrew: Set library path for WeasyPrint's GLib/GTK dependencies
|
||||
_homebrew_lib = Path("/opt/homebrew/lib")
|
||||
if sys.platform == "darwin" and _homebrew_lib.exists():
|
||||
_current_path = os.environ.get("DYLD_LIBRARY_PATH", "")
|
||||
_homebrew_str = str(_homebrew_lib)
|
||||
if _homebrew_str not in _current_path:
|
||||
os.environ["DYLD_LIBRARY_PATH"] = (
|
||||
f"{_homebrew_str}:{_current_path}" if _current_path else _homebrew_str
|
||||
)
|
||||
|
||||
# ============================================================================
|
||||
# Module-level mocks (run before pytest collection)
|
||||
# ============================================================================
|
||||
@@ -137,6 +155,10 @@ def mock_uow() -> MagicMock:
|
||||
uow.preferences = MagicMock()
|
||||
uow.diarization_jobs = MagicMock()
|
||||
uow.entities = MagicMock()
|
||||
uow.webhooks = MagicMock()
|
||||
uow.integrations = MagicMock()
|
||||
uow.supports_webhooks = True
|
||||
uow.supports_integrations = True
|
||||
return uow
|
||||
|
||||
|
||||
@@ -150,3 +172,42 @@ def crypto() -> AesGcmCryptoBox:
|
||||
def meetings_dir(tmp_path: Path) -> Path:
|
||||
"""Create temporary meetings directory."""
|
||||
return tmp_path / "meetings"
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def webhook_config() -> WebhookConfig:
|
||||
"""Create a webhook config subscribed to MEETING_COMPLETED event."""
|
||||
return WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://example.com/webhook",
|
||||
events=[WebhookEventType.MEETING_COMPLETED],
|
||||
name="Test Webhook",
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def webhook_config_all_events() -> WebhookConfig:
|
||||
"""Create a webhook config subscribed to all events."""
|
||||
return WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://example.com/webhook",
|
||||
events=[
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
WebhookEventType.SUMMARY_GENERATED,
|
||||
WebhookEventType.RECORDING_STARTED,
|
||||
WebhookEventType.RECORDING_STOPPED,
|
||||
],
|
||||
name="All Events Webhook",
|
||||
secret="test-secret-key",
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def calendar_settings() -> CalendarSettings:
|
||||
"""Create test calendar settings for OAuth testing."""
|
||||
return CalendarSettings(
|
||||
google_client_id="test-google-client-id",
|
||||
google_client_secret="test-google-client-secret",
|
||||
outlook_client_id="test-outlook-client-id",
|
||||
outlook_client_secret="test-outlook-client-secret",
|
||||
)
|
||||
|
||||
@@ -3,6 +3,7 @@
|
||||
from __future__ import annotations
|
||||
|
||||
from datetime import datetime, timedelta
|
||||
from uuid import UUID
|
||||
|
||||
import pytest
|
||||
|
||||
@@ -55,7 +56,7 @@ class TestMeetingCreation:
|
||||
title="Restored Meeting",
|
||||
state=MeetingState.STOPPED,
|
||||
)
|
||||
assert str(meeting.id) == uuid_str
|
||||
assert meeting.id == UUID(uuid_str)
|
||||
assert meeting.title == "Restored Meeting"
|
||||
assert meeting.state == MeetingState.STOPPED
|
||||
|
||||
|
||||
@@ -28,15 +28,15 @@ class TestKeyPoint:
|
||||
kp = KeyPoint(text="Important discussion about architecture")
|
||||
assert getattr(kp, attr) == expected
|
||||
|
||||
def test_key_point_has_evidence_false(self) -> None:
|
||||
"""Test has_evidence returns False when no segment_ids."""
|
||||
def test_key_point_is_sourced_false(self) -> None:
|
||||
"""Test is_sourced returns False when no segment_ids."""
|
||||
kp = KeyPoint(text="No evidence")
|
||||
assert kp.has_evidence() is False
|
||||
assert kp.is_sourced() is False
|
||||
|
||||
def test_key_point_has_evidence_true(self) -> None:
|
||||
"""Test has_evidence returns True with segment_ids."""
|
||||
def test_key_point_is_sourced_true(self) -> None:
|
||||
"""Test is_sourced returns True with segment_ids."""
|
||||
kp = KeyPoint(text="With evidence", segment_ids=[1, 2, 3])
|
||||
assert kp.has_evidence() is True
|
||||
assert kp.is_sourced() is True
|
||||
|
||||
def test_key_point_with_timing(self) -> None:
|
||||
"""Test KeyPoint with timing information."""
|
||||
|
||||
@@ -65,11 +65,11 @@ class TestSprint0Messages:
|
||||
assert hasattr(noteflow_pb2, "ListCalendarEventsResponse"), (
|
||||
"ListCalendarEventsResponse missing"
|
||||
)
|
||||
assert hasattr(noteflow_pb2, "InitiateCalendarAuthRequest"), (
|
||||
"InitiateCalendarAuthRequest missing"
|
||||
assert hasattr(noteflow_pb2, "GetCalendarProvidersRequest"), (
|
||||
"GetCalendarProvidersRequest missing"
|
||||
)
|
||||
assert hasattr(noteflow_pb2, "CompleteCalendarAuthResponse"), (
|
||||
"CompleteCalendarAuthResponse missing"
|
||||
assert hasattr(noteflow_pb2, "GetCalendarProvidersResponse"), (
|
||||
"GetCalendarProvidersResponse missing"
|
||||
)
|
||||
|
||||
def test_export_format_pdf_exists(self) -> None:
|
||||
@@ -96,8 +96,6 @@ class TestSprint0RPCs:
|
||||
servicer = noteflow_pb2_grpc.NoteFlowServiceServicer
|
||||
assert hasattr(servicer, "ListCalendarEvents"), "ListCalendarEvents RPC missing"
|
||||
assert hasattr(servicer, "GetCalendarProviders"), "GetCalendarProviders RPC missing"
|
||||
assert hasattr(servicer, "InitiateCalendarAuth"), "InitiateCalendarAuth RPC missing"
|
||||
assert hasattr(servicer, "CompleteCalendarAuth"), "CompleteCalendarAuth RPC missing"
|
||||
|
||||
def test_stub_rpc_methods_in_init(self) -> None:
|
||||
"""NoteFlowServiceStub __init__ assigns expected RPC methods."""
|
||||
@@ -109,5 +107,3 @@ class TestSprint0RPCs:
|
||||
assert "self.ExtractEntities" in source, "ExtractEntities stub missing"
|
||||
assert "self.ListCalendarEvents" in source, "ListCalendarEvents stub missing"
|
||||
assert "self.GetCalendarProviders" in source, "GetCalendarProviders stub missing"
|
||||
assert "self.InitiateCalendarAuth" in source, "InitiateCalendarAuth stub missing"
|
||||
assert "self.CompleteCalendarAuth" in source, "CompleteCalendarAuth stub missing"
|
||||
|
||||
@@ -288,23 +288,23 @@ class TestMeetingAudioWriterErrors:
|
||||
class TestMeetingAudioWriterProperties:
|
||||
"""Tests for MeetingAudioWriter properties."""
|
||||
|
||||
def test_is_open_property(
|
||||
def test_is_recording_property(
|
||||
self,
|
||||
crypto: AesGcmCryptoBox,
|
||||
meetings_dir: Path,
|
||||
) -> None:
|
||||
"""Test is_open property reflects writer state."""
|
||||
"""Test is_recording property reflects writer state."""
|
||||
writer = MeetingAudioWriter(crypto, meetings_dir)
|
||||
dek = crypto.generate_dek()
|
||||
wrapped_dek = crypto.wrap_dek(dek)
|
||||
|
||||
assert writer.is_open is False
|
||||
assert writer.is_recording is False
|
||||
|
||||
writer.open(str(uuid4()), dek, wrapped_dek)
|
||||
assert writer.is_open is True
|
||||
assert writer.is_recording is True
|
||||
|
||||
writer.close()
|
||||
assert writer.is_open is False
|
||||
assert writer.is_recording is False
|
||||
|
||||
def test_meeting_dir_property(
|
||||
self,
|
||||
|
||||
@@ -12,17 +12,6 @@ from noteflow.domain.value_objects import OAuthProvider
|
||||
from noteflow.infrastructure.calendar.oauth_manager import OAuthError
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def calendar_settings() -> CalendarSettings:
|
||||
"""Create test calendar settings."""
|
||||
return CalendarSettings(
|
||||
google_client_id="test-google-client-id",
|
||||
google_client_secret="test-google-client-secret",
|
||||
outlook_client_id="test-outlook-client-id",
|
||||
outlook_client_secret="test-outlook-client-secret",
|
||||
)
|
||||
|
||||
|
||||
class TestOAuthManagerInitiateAuth:
|
||||
"""Tests for OAuthManager.initiate_auth."""
|
||||
|
||||
|
||||
@@ -6,17 +6,25 @@ import pytest
|
||||
|
||||
from noteflow.domain.entities import ActionItem, KeyPoint, Meeting, Segment, Summary
|
||||
|
||||
try:
|
||||
from weasyprint import HTML as WeasyHTML
|
||||
|
||||
WEASYPRINT_AVAILABLE = True
|
||||
except ImportError:
|
||||
WEASYPRINT_AVAILABLE = False
|
||||
def _check_weasyprint_available() -> bool:
|
||||
"""Check if weasyprint is available with working native libraries."""
|
||||
try:
|
||||
from weasyprint import HTML # noqa: F401
|
||||
|
||||
return True
|
||||
except (ImportError, OSError):
|
||||
# ImportError: weasyprint not installed
|
||||
# OSError: weasyprint installed but native libs (GTK/GLib) missing
|
||||
return False
|
||||
|
||||
|
||||
WEASYPRINT_AVAILABLE = _check_weasyprint_available()
|
||||
|
||||
|
||||
pytestmark = pytest.mark.skipif(
|
||||
not WEASYPRINT_AVAILABLE,
|
||||
reason="weasyprint not installed - install with: pip install noteflow[pdf]",
|
||||
reason="weasyprint not available - requires GTK/GLib native libraries",
|
||||
)
|
||||
|
||||
|
||||
|
||||
1
tests/infrastructure/webhooks/__init__.py
Normal file
1
tests/infrastructure/webhooks/__init__.py
Normal file
@@ -0,0 +1 @@
|
||||
"""Webhook infrastructure tests."""
|
||||
311
tests/infrastructure/webhooks/test_executor.py
Normal file
311
tests/infrastructure/webhooks/test_executor.py
Normal file
@@ -0,0 +1,311 @@
|
||||
"""Unit tests for WebhookExecutor."""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import hashlib
|
||||
import hmac
|
||||
import json
|
||||
from unittest.mock import AsyncMock, patch
|
||||
from uuid import uuid4
|
||||
|
||||
import httpx
|
||||
import pytest
|
||||
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookEventType
|
||||
from noteflow.infrastructure.webhooks import WebhookExecutor
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def executor() -> WebhookExecutor:
|
||||
"""Create a WebhookExecutor instance."""
|
||||
return WebhookExecutor(max_retries=3, timeout_seconds=5.0)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def enabled_config() -> WebhookConfig:
|
||||
"""Create an enabled webhook config for all events."""
|
||||
return WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://example.com/webhook",
|
||||
events=[
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
WebhookEventType.SUMMARY_GENERATED,
|
||||
],
|
||||
name="Test Webhook",
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def disabled_config() -> WebhookConfig:
|
||||
"""Create a disabled webhook config."""
|
||||
workspace_id = uuid4()
|
||||
now = WebhookConfig.create(workspace_id, "", [WebhookEventType.MEETING_COMPLETED]).created_at
|
||||
return WebhookConfig(
|
||||
id=uuid4(),
|
||||
workspace_id=workspace_id,
|
||||
url="https://example.com/webhook",
|
||||
events=frozenset([WebhookEventType.MEETING_COMPLETED]),
|
||||
name="Disabled Webhook",
|
||||
enabled=False,
|
||||
created_at=now,
|
||||
updated_at=now,
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def signed_config() -> WebhookConfig:
|
||||
"""Create a webhook config with HMAC secret."""
|
||||
return WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://example.com/webhook",
|
||||
events=[WebhookEventType.MEETING_COMPLETED],
|
||||
name="Signed Webhook",
|
||||
secret="test-secret-key",
|
||||
)
|
||||
|
||||
|
||||
class TestWebhookExecutorDelivery:
|
||||
"""Test webhook delivery functionality."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_deliver_success(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Deliver webhook successfully when server returns 200."""
|
||||
payload = {"event": "meeting.completed", "meeting_id": "123"}
|
||||
|
||||
mock_response = httpx.Response(200)
|
||||
with patch.object(
|
||||
httpx.AsyncClient, "post", new_callable=AsyncMock, return_value=mock_response
|
||||
):
|
||||
delivery = await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert delivery.succeeded is True
|
||||
assert delivery.status_code == 200
|
||||
assert delivery.attempt_count == 1
|
||||
assert delivery.error_message is None
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_deliver_disabled_webhook(
|
||||
self, executor: WebhookExecutor, disabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Skip delivery for disabled webhooks."""
|
||||
payload = {"event": "meeting.completed"}
|
||||
|
||||
delivery = await executor.deliver(
|
||||
disabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert delivery.succeeded is False
|
||||
assert delivery.attempt_count == 0
|
||||
assert delivery.error_message == "Webhook disabled"
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_deliver_unsubscribed_event(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Skip delivery for events not in subscription list."""
|
||||
payload = {"event": "recording.started"}
|
||||
|
||||
delivery = await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.RECORDING_STARTED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert delivery.succeeded is False
|
||||
assert delivery.attempt_count == 0
|
||||
assert "not subscribed" in str(delivery.error_message)
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_deliver_retries_on_timeout(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Retry delivery when request times out."""
|
||||
payload = {"event": "meeting.completed"}
|
||||
|
||||
with patch.object(
|
||||
httpx.AsyncClient,
|
||||
"post",
|
||||
new_callable=AsyncMock,
|
||||
side_effect=httpx.TimeoutException("Timeout"),
|
||||
):
|
||||
delivery = await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert delivery.succeeded is False
|
||||
assert delivery.attempt_count == 3 # max_retries
|
||||
assert "Max retries exceeded" in str(delivery.error_message)
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_deliver_retries_on_connection_error(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Retry delivery when connection fails."""
|
||||
payload = {"event": "meeting.completed"}
|
||||
|
||||
with patch.object(
|
||||
httpx.AsyncClient,
|
||||
"post",
|
||||
new_callable=AsyncMock,
|
||||
side_effect=httpx.ConnectError("Connection refused"),
|
||||
):
|
||||
delivery = await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert delivery.succeeded is False
|
||||
assert delivery.attempt_count == 3
|
||||
assert "Max retries exceeded" in str(delivery.error_message)
|
||||
|
||||
|
||||
class TestHmacSignature:
|
||||
"""Test HMAC signature generation."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_hmac_signature_generation(
|
||||
self, executor: WebhookExecutor, signed_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Generate valid HMAC-SHA256 signature when secret is configured."""
|
||||
payload = {"event": "meeting.completed", "meeting_id": "123"}
|
||||
|
||||
captured_headers: dict[str, str] = {}
|
||||
|
||||
async def capture_request(*args: object, **kwargs: object) -> httpx.Response:
|
||||
headers = kwargs.get("headers", {})
|
||||
assert isinstance(headers, dict)
|
||||
captured_headers.update(headers)
|
||||
return httpx.Response(200)
|
||||
|
||||
with patch.object(
|
||||
httpx.AsyncClient, "post", new_callable=AsyncMock, side_effect=capture_request
|
||||
):
|
||||
await executor.deliver(
|
||||
signed_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert "X-NoteFlow-Signature" in captured_headers
|
||||
signature_header = captured_headers["X-NoteFlow-Signature"]
|
||||
assert signature_header.startswith("sha256=")
|
||||
|
||||
# Verify signature is correct
|
||||
expected_body = json.dumps(payload, separators=(",", ":"))
|
||||
expected_signature = hmac.new(
|
||||
signed_config.secret.encode(), # type: ignore[union-attr]
|
||||
expected_body.encode(),
|
||||
hashlib.sha256,
|
||||
).hexdigest()
|
||||
|
||||
assert signature_header == f"sha256={expected_signature}"
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_no_signature_without_secret(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Omit signature header when secret is not configured."""
|
||||
payload = {"event": "meeting.completed"}
|
||||
|
||||
captured_headers: dict[str, str] = {}
|
||||
|
||||
async def capture_request(*args: object, **kwargs: object) -> httpx.Response:
|
||||
headers = kwargs.get("headers", {})
|
||||
assert isinstance(headers, dict)
|
||||
captured_headers.update(headers)
|
||||
return httpx.Response(200)
|
||||
|
||||
with patch.object(
|
||||
httpx.AsyncClient, "post", new_callable=AsyncMock, side_effect=capture_request
|
||||
):
|
||||
await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert "X-NoteFlow-Signature" not in captured_headers
|
||||
|
||||
|
||||
class TestWebhookHeaders:
|
||||
"""Test webhook request headers."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_includes_event_header(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Include event type in X-NoteFlow-Event header."""
|
||||
payload = {"event": "meeting.completed"}
|
||||
|
||||
captured_headers: dict[str, str] = {}
|
||||
|
||||
async def capture_request(*args: object, **kwargs: object) -> httpx.Response:
|
||||
headers = kwargs.get("headers", {})
|
||||
assert isinstance(headers, dict)
|
||||
captured_headers.update(headers)
|
||||
return httpx.Response(200)
|
||||
|
||||
with patch.object(
|
||||
httpx.AsyncClient, "post", new_callable=AsyncMock, side_effect=capture_request
|
||||
):
|
||||
await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert captured_headers.get("X-NoteFlow-Event") == "meeting.completed"
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_includes_delivery_id_header(
|
||||
self, executor: WebhookExecutor, enabled_config: WebhookConfig
|
||||
) -> None:
|
||||
"""Include unique delivery ID in X-NoteFlow-Delivery header."""
|
||||
payload = {"event": "meeting.completed"}
|
||||
|
||||
captured_headers: dict[str, str] = {}
|
||||
|
||||
async def capture_request(*args: object, **kwargs: object) -> httpx.Response:
|
||||
headers = kwargs.get("headers", {})
|
||||
assert isinstance(headers, dict)
|
||||
captured_headers.update(headers)
|
||||
return httpx.Response(200)
|
||||
|
||||
with patch.object(
|
||||
httpx.AsyncClient, "post", new_callable=AsyncMock, side_effect=capture_request
|
||||
):
|
||||
await executor.deliver(
|
||||
enabled_config,
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
payload,
|
||||
)
|
||||
|
||||
assert "X-NoteFlow-Delivery" in captured_headers
|
||||
# Verify it's a valid UUID format
|
||||
delivery_id = captured_headers["X-NoteFlow-Delivery"]
|
||||
assert len(delivery_id) == 36 # UUID length with hyphens
|
||||
|
||||
|
||||
class TestExecutorCleanup:
|
||||
"""Test executor resource cleanup."""
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_close_cleans_up_client(self, executor: WebhookExecutor) -> None:
|
||||
"""Close method cleans up HTTP client."""
|
||||
# Trigger client creation
|
||||
await executor._ensure_client()
|
||||
assert executor._client is not None
|
||||
|
||||
await executor.close()
|
||||
assert executor._client is None
|
||||
@@ -56,18 +56,18 @@ class TestDiarizationJobRepository:
|
||||
created = await job_repo.create(job)
|
||||
await session.commit()
|
||||
|
||||
assert created.job_id == job.job_id
|
||||
assert created.status == JOB_STATUS_QUEUED
|
||||
assert created.job_id == job.job_id, "created job should have same job_id"
|
||||
assert created.status == JOB_STATUS_QUEUED, "created job should have queued status"
|
||||
|
||||
retrieved = await job_repo.get(job.job_id)
|
||||
|
||||
assert retrieved is not None
|
||||
assert retrieved.job_id == job.job_id
|
||||
assert retrieved.meeting_id == str(meeting.id)
|
||||
assert retrieved.status == JOB_STATUS_QUEUED
|
||||
assert retrieved.segments_updated == 0
|
||||
assert retrieved.speaker_ids == []
|
||||
assert retrieved.error_message == ""
|
||||
assert retrieved is not None, "get should return the created job"
|
||||
assert retrieved.job_id == job.job_id, "retrieved job_id should match"
|
||||
assert retrieved.meeting_id == job.meeting_id, "retrieved meeting_id should match"
|
||||
assert retrieved.status == JOB_STATUS_QUEUED, "retrieved status should be queued"
|
||||
assert retrieved.segments_updated == 0, "segments_updated should default to 0"
|
||||
assert retrieved.speaker_ids == [], "speaker_ids should default to empty list"
|
||||
assert retrieved.error_message == "", "error_message should default to empty string"
|
||||
|
||||
async def test_get_nonexistent_job_returns_none(self, session: AsyncSession) -> None:
|
||||
"""Test retrieving a job that doesn't exist returns None."""
|
||||
@@ -129,14 +129,14 @@ class TestDiarizationJobRepository:
|
||||
)
|
||||
await session.commit()
|
||||
|
||||
assert updated is True
|
||||
assert updated is True, "update_status should return True on success"
|
||||
|
||||
retrieved = await job_repo.get(job.job_id)
|
||||
assert retrieved is not None
|
||||
assert retrieved.status == JOB_STATUS_COMPLETED
|
||||
assert retrieved.segments_updated == 42
|
||||
assert retrieved.speaker_ids == speaker_ids
|
||||
assert retrieved.error_message == ""
|
||||
assert retrieved is not None, "job should be retrievable after update"
|
||||
assert retrieved.status == JOB_STATUS_COMPLETED, "status should be completed"
|
||||
assert retrieved.segments_updated == 42, "segments_updated should be persisted"
|
||||
assert retrieved.speaker_ids == speaker_ids, "speaker_ids should be persisted"
|
||||
assert retrieved.error_message == "", "error_message should remain empty"
|
||||
|
||||
async def test_update_status_to_failed_with_error(self, session: AsyncSession) -> None:
|
||||
"""Test updating job status to FAILED with error message."""
|
||||
@@ -261,12 +261,12 @@ class TestDiarizationJobCrashRecovery:
|
||||
failed_count = await job_repo.mark_running_as_failed("Server crashed")
|
||||
await session.commit()
|
||||
|
||||
assert failed_count == 1
|
||||
assert failed_count == 1, "should mark 1 queued job as failed"
|
||||
|
||||
retrieved = await job_repo.get(job.job_id)
|
||||
assert retrieved is not None
|
||||
assert retrieved.status == JOB_STATUS_FAILED
|
||||
assert retrieved.error_message == "Server crashed"
|
||||
assert retrieved is not None, "job should still be retrievable"
|
||||
assert retrieved.status == JOB_STATUS_FAILED, "status should be FAILED"
|
||||
assert retrieved.error_message == "Server crashed", "error message should be set"
|
||||
|
||||
async def test_mark_running_as_failed_recovers_running_jobs(
|
||||
self, session: AsyncSession
|
||||
@@ -290,11 +290,11 @@ class TestDiarizationJobCrashRecovery:
|
||||
failed_count = await job_repo.mark_running_as_failed()
|
||||
await session.commit()
|
||||
|
||||
assert failed_count == 1
|
||||
assert failed_count == 1, "should mark 1 running job"
|
||||
|
||||
retrieved = await job_repo.get(job.job_id)
|
||||
assert retrieved is not None
|
||||
assert retrieved.status == JOB_STATUS_FAILED
|
||||
assert retrieved is not None, "job should still be retrievable"
|
||||
assert retrieved.status == JOB_STATUS_FAILED, "status should be FAILED"
|
||||
|
||||
async def test_mark_running_as_failed_ignores_completed_jobs(
|
||||
self, session: AsyncSession
|
||||
@@ -319,12 +319,12 @@ class TestDiarizationJobCrashRecovery:
|
||||
failed_count = await job_repo.mark_running_as_failed()
|
||||
await session.commit()
|
||||
|
||||
assert failed_count == 0
|
||||
assert failed_count == 0, "should not mark completed jobs"
|
||||
|
||||
retrieved = await job_repo.get(job.job_id)
|
||||
assert retrieved is not None
|
||||
assert retrieved.status == JOB_STATUS_COMPLETED
|
||||
assert retrieved.segments_updated == 10
|
||||
assert retrieved is not None, "completed job should exist"
|
||||
assert retrieved.status == JOB_STATUS_COMPLETED, "status should remain COMPLETED"
|
||||
assert retrieved.segments_updated == 10, "segments_updated should remain unchanged"
|
||||
|
||||
async def test_mark_running_as_failed_ignores_already_failed_jobs(
|
||||
self, session: AsyncSession
|
||||
@@ -391,14 +391,14 @@ class TestDiarizationJobCrashRecovery:
|
||||
failed_count = await job_repo.mark_running_as_failed()
|
||||
await session.commit()
|
||||
|
||||
assert failed_count == 2
|
||||
assert failed_count == 2, "should mark queued and running jobs"
|
||||
|
||||
j1 = await job_repo.get(queued_job.job_id)
|
||||
j2 = await job_repo.get(running_job.job_id)
|
||||
j3 = await job_repo.get(completed_job.job_id)
|
||||
assert j1 is not None and j1.status == JOB_STATUS_FAILED
|
||||
assert j2 is not None and j2.status == JOB_STATUS_FAILED
|
||||
assert j3 is not None and j3.status == JOB_STATUS_COMPLETED
|
||||
assert j1 is not None and j1.status == JOB_STATUS_FAILED, "queued job should be failed"
|
||||
assert j2 is not None and j2.status == JOB_STATUS_FAILED, "running job should be failed"
|
||||
assert j3 is not None and j3.status == JOB_STATUS_COMPLETED, "completed job unchanged"
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
|
||||
@@ -90,10 +90,10 @@ class TestUnitOfWorkCrossRepositoryOperations:
|
||||
segments = await uow.segments.get_by_meeting(meeting.id)
|
||||
s = await uow.summaries.get_by_meeting(meeting.id)
|
||||
|
||||
assert m is not None
|
||||
assert len(segments) == 3
|
||||
assert s is not None
|
||||
assert s.executive_summary == "Test summary"
|
||||
assert m is not None, "meeting should be retrievable"
|
||||
assert len(segments) == 3, "all 3 segments should be persisted"
|
||||
assert s is not None, "summary should be retrievable"
|
||||
assert s.executive_summary == "Test summary", "summary content should match"
|
||||
|
||||
async def test_meeting_deletion_cascades_to_segments_and_summary(
|
||||
self, session_factory: async_sessionmaker[AsyncSession]
|
||||
@@ -175,10 +175,10 @@ class TestUnitOfWorkConcurrency:
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
m1 = await uow.meetings.get(meeting1.id)
|
||||
m2 = await uow.meetings.get(meeting2.id)
|
||||
assert m1 is not None
|
||||
assert m2 is not None
|
||||
assert m1.title == "Meeting 1"
|
||||
assert m2.title == "Meeting 2"
|
||||
assert m1 is not None, "meeting1 should be retrievable"
|
||||
assert m2 is not None, "meeting2 should be retrievable"
|
||||
assert m1.title == "Meeting 1", "meeting1 title should match"
|
||||
assert m2.title == "Meeting 2", "meeting2 title should match"
|
||||
|
||||
async def test_concurrent_updates_to_different_meetings(
|
||||
self, session_factory: async_sessionmaker[AsyncSession]
|
||||
@@ -194,7 +194,7 @@ class TestUnitOfWorkConcurrency:
|
||||
async def update_m1() -> None:
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
meeting = await uow.meetings.get(m1.id)
|
||||
assert meeting is not None
|
||||
assert meeting is not None, "m1 should exist for update"
|
||||
meeting.start_recording()
|
||||
await uow.meetings.update(meeting)
|
||||
await uow.commit()
|
||||
@@ -202,7 +202,7 @@ class TestUnitOfWorkConcurrency:
|
||||
async def update_m2() -> None:
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
meeting = await uow.meetings.get(m2.id)
|
||||
assert meeting is not None
|
||||
assert meeting is not None, "m2 should exist for update"
|
||||
meeting.start_recording()
|
||||
await uow.meetings.update(meeting)
|
||||
await uow.commit()
|
||||
@@ -212,8 +212,8 @@ class TestUnitOfWorkConcurrency:
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
final_m1 = await uow.meetings.get(m1.id)
|
||||
final_m2 = await uow.meetings.get(m2.id)
|
||||
assert final_m1 is not None and final_m1.state == MeetingState.RECORDING
|
||||
assert final_m2 is not None and final_m2.state == MeetingState.RECORDING
|
||||
assert final_m1 is not None and final_m1.state == MeetingState.RECORDING, "m1 recording"
|
||||
assert final_m2 is not None and final_m2.state == MeetingState.RECORDING, "m2 recording"
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
|
||||
212
tests/integration/test_webhook_integration.py
Normal file
212
tests/integration/test_webhook_integration.py
Normal file
@@ -0,0 +1,212 @@
|
||||
"""Integration tests for webhook triggering in gRPC service.
|
||||
|
||||
Tests the complete webhook flow from gRPC operations to webhook delivery.
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from typing import TYPE_CHECKING, Any
|
||||
from unittest.mock import AsyncMock, MagicMock
|
||||
from uuid import uuid4
|
||||
|
||||
import grpc
|
||||
import pytest
|
||||
|
||||
from noteflow.application.services.webhook_service import WebhookService
|
||||
from noteflow.domain.entities import Meeting, Segment
|
||||
from noteflow.domain.webhooks import WebhookConfig, WebhookDelivery, WebhookEventType
|
||||
from noteflow.grpc.proto import noteflow_pb2
|
||||
from noteflow.grpc.service import NoteFlowServicer
|
||||
from noteflow.infrastructure.persistence.unit_of_work import SqlAlchemyUnitOfWork
|
||||
from noteflow.infrastructure.webhooks import WebhookExecutor
|
||||
|
||||
if TYPE_CHECKING:
|
||||
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker
|
||||
|
||||
|
||||
class MockGrpcContext:
|
||||
"""Mock gRPC context for testing."""
|
||||
|
||||
def __init__(self) -> None:
|
||||
"""Initialize mock context."""
|
||||
self.aborted = False
|
||||
self.abort_code: grpc.StatusCode | None = None
|
||||
self.abort_details: str | None = None
|
||||
|
||||
async def abort(self, code: grpc.StatusCode, details: str) -> None:
|
||||
"""Record abort and raise to simulate gRPC behavior."""
|
||||
self.aborted = True
|
||||
self.abort_code = code
|
||||
self.abort_details = details
|
||||
raise grpc.RpcError()
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def captured_webhook_calls() -> list[dict[str, Any]]:
|
||||
"""Store webhook calls for verification."""
|
||||
return []
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def mock_webhook_executor(captured_webhook_calls: list[dict[str, Any]]) -> MagicMock:
|
||||
"""Create a mock executor that captures calls."""
|
||||
executor = MagicMock(spec=WebhookExecutor)
|
||||
|
||||
async def capture_delivery(
|
||||
config: WebhookConfig,
|
||||
event_type: WebhookEventType,
|
||||
payload: dict[str, Any],
|
||||
) -> WebhookDelivery:
|
||||
captured_webhook_calls.append({
|
||||
"config_url": config.url,
|
||||
"event_type": event_type,
|
||||
"payload": payload,
|
||||
})
|
||||
return WebhookDelivery.create(
|
||||
webhook_id=config.id,
|
||||
event_type=event_type,
|
||||
payload=payload,
|
||||
status_code=200,
|
||||
)
|
||||
|
||||
executor.deliver = AsyncMock(side_effect=capture_delivery)
|
||||
executor.close = AsyncMock()
|
||||
return executor
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def webhook_service_with_config(
|
||||
mock_webhook_executor: MagicMock,
|
||||
) -> WebhookService:
|
||||
"""Create a webhook service with a registered webhook."""
|
||||
service = WebhookService(executor=mock_webhook_executor)
|
||||
config = WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://test.example.com/webhook",
|
||||
events=[
|
||||
WebhookEventType.MEETING_COMPLETED,
|
||||
WebhookEventType.RECORDING_STARTED,
|
||||
WebhookEventType.RECORDING_STOPPED,
|
||||
],
|
||||
name="Integration Test Webhook",
|
||||
)
|
||||
service.register_webhook(config)
|
||||
return service
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
class TestStopMeetingTriggersWebhook:
|
||||
"""Test that StopMeeting triggers webhook delivery."""
|
||||
|
||||
async def test_stop_meeting_triggers_meeting_completed_webhook(
|
||||
self,
|
||||
session_factory: async_sessionmaker[AsyncSession],
|
||||
webhook_service_with_config: WebhookService,
|
||||
captured_webhook_calls: list[dict[str, Any]],
|
||||
) -> None:
|
||||
"""Stopping a meeting triggers meeting.completed webhook."""
|
||||
# Create a meeting in recording state with a segment
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
meeting = Meeting.create(title="Webhook Integration Test")
|
||||
meeting.start_recording()
|
||||
await uow.meetings.create(meeting)
|
||||
segment = Segment(
|
||||
segment_id=0,
|
||||
text="Test segment content",
|
||||
start_time=0.0,
|
||||
end_time=5.0,
|
||||
meeting_id=meeting.id,
|
||||
)
|
||||
await uow.segments.add(meeting.id, segment)
|
||||
await uow.commit()
|
||||
meeting_id = str(meeting.id)
|
||||
|
||||
servicer = NoteFlowServicer(
|
||||
session_factory=session_factory,
|
||||
webhook_service=webhook_service_with_config,
|
||||
)
|
||||
|
||||
request = noteflow_pb2.StopMeetingRequest(meeting_id=meeting_id)
|
||||
result = await servicer.StopMeeting(request, MockGrpcContext())
|
||||
|
||||
# StopMeeting returns Meeting proto directly - state should be STOPPED
|
||||
assert result.state == noteflow_pb2.MEETING_STATE_STOPPED
|
||||
|
||||
# Verify webhooks were triggered (recording.stopped + meeting.completed)
|
||||
assert len(captured_webhook_calls) == 2
|
||||
|
||||
event_types = {call["event_type"] for call in captured_webhook_calls}
|
||||
assert WebhookEventType.RECORDING_STOPPED in event_types
|
||||
assert WebhookEventType.MEETING_COMPLETED in event_types
|
||||
|
||||
# Verify meeting.completed payload
|
||||
completed_call = next(
|
||||
c for c in captured_webhook_calls
|
||||
if c["event_type"] == WebhookEventType.MEETING_COMPLETED
|
||||
)
|
||||
assert completed_call["payload"]["meeting_id"] == meeting_id
|
||||
assert completed_call["payload"]["title"] == "Webhook Integration Test"
|
||||
|
||||
async def test_stop_meeting_with_failed_webhook_still_succeeds(
|
||||
self,
|
||||
session_factory: async_sessionmaker[AsyncSession],
|
||||
mock_webhook_executor: MagicMock,
|
||||
) -> None:
|
||||
"""Meeting stop succeeds even when webhook delivery fails."""
|
||||
mock_webhook_executor.deliver = AsyncMock(
|
||||
side_effect=RuntimeError("Webhook server unreachable")
|
||||
)
|
||||
|
||||
webhook_service = WebhookService(executor=mock_webhook_executor)
|
||||
webhook_service.register_webhook(
|
||||
WebhookConfig.create(
|
||||
workspace_id=uuid4(),
|
||||
url="https://failing.example.com/webhook",
|
||||
events=[WebhookEventType.MEETING_COMPLETED],
|
||||
)
|
||||
)
|
||||
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
meeting = Meeting.create(title="Webhook Failure Test")
|
||||
meeting.start_recording()
|
||||
await uow.meetings.create(meeting)
|
||||
await uow.commit()
|
||||
meeting_id = str(meeting.id)
|
||||
|
||||
servicer = NoteFlowServicer(
|
||||
session_factory=session_factory,
|
||||
webhook_service=webhook_service,
|
||||
)
|
||||
|
||||
request = noteflow_pb2.StopMeetingRequest(meeting_id=meeting_id)
|
||||
result = await servicer.StopMeeting(request, MockGrpcContext())
|
||||
|
||||
# Meeting stop succeeds despite webhook failure
|
||||
assert result.state == noteflow_pb2.MEETING_STATE_STOPPED
|
||||
|
||||
|
||||
@pytest.mark.integration
|
||||
class TestNoWebhookServiceGracefulDegradation:
|
||||
"""Test that operations work without webhook service configured."""
|
||||
|
||||
async def test_stop_meeting_works_without_webhook_service(
|
||||
self,
|
||||
session_factory: async_sessionmaker[AsyncSession],
|
||||
) -> None:
|
||||
"""Meeting operations work when no webhook service is configured."""
|
||||
async with SqlAlchemyUnitOfWork(session_factory) as uow:
|
||||
meeting = Meeting.create(title="No Webhooks Test")
|
||||
meeting.start_recording()
|
||||
await uow.meetings.create(meeting)
|
||||
await uow.commit()
|
||||
meeting_id = str(meeting.id)
|
||||
|
||||
servicer = NoteFlowServicer(
|
||||
session_factory=session_factory,
|
||||
webhook_service=None,
|
||||
)
|
||||
|
||||
request = noteflow_pb2.StopMeetingRequest(meeting_id=meeting_id)
|
||||
result = await servicer.StopMeeting(request, MockGrpcContext())
|
||||
|
||||
assert result.state == noteflow_pb2.MEETING_STATE_STOPPED
|
||||
@@ -72,11 +72,14 @@ def find_python_files(root: Path, exclude_protocols: bool = False) -> list[Path]
|
||||
|
||||
Args:
|
||||
root: Root directory to search.
|
||||
exclude_protocols: If True, exclude protocol/port files (interfaces).
|
||||
exclude_protocols: If True, exclude protocol/port files and repository
|
||||
implementations (interfaces + their implementations are expected to match).
|
||||
"""
|
||||
excluded = {"*_pb2.py", "*_pb2_grpc.py", "*_pb2.pyi"}
|
||||
# Protocol/port files define interfaces - implementations are expected to match
|
||||
protocol_patterns = {"protocols.py", "ports.py"} if exclude_protocols else set()
|
||||
# Repository implementations implement Protocol interfaces - matching signatures expected
|
||||
repo_dir_patterns = {"repositories", "memory"} if exclude_protocols else set()
|
||||
|
||||
files: list[Path] = []
|
||||
for py_file in root.rglob("*.py"):
|
||||
@@ -88,6 +91,9 @@ def find_python_files(root: Path, exclude_protocols: bool = False) -> list[Path]
|
||||
continue
|
||||
if exclude_protocols and "ports" in py_file.parts:
|
||||
continue
|
||||
# Exclude repository implementations (they implement Protocol interfaces)
|
||||
if exclude_protocols and any(d in py_file.parts for d in repo_dir_patterns):
|
||||
continue
|
||||
files.append(py_file)
|
||||
|
||||
return files
|
||||
@@ -119,7 +125,6 @@ def test_helpers_not_scattered() -> None:
|
||||
)
|
||||
|
||||
# Target: 15 scattered helpers max - some duplication is expected for:
|
||||
# - Repository implementations (memory + SQL)
|
||||
# - Client/server pairs with same method names
|
||||
# - Mixin protocols + implementations
|
||||
assert len(scattered) <= 15, (
|
||||
@@ -191,7 +196,6 @@ def test_no_duplicate_helper_implementations() -> None:
|
||||
duplicates.append(f"'{signature}' defined at: {', '.join(loc_strs)}")
|
||||
|
||||
# Target: 25 duplicate helper signatures - some duplication expected for:
|
||||
# - Repository pattern (memory + SQL implementations)
|
||||
# - Mixin composition (protocol + implementation)
|
||||
# - Client/server pairs
|
||||
assert len(duplicates) <= 25, (
|
||||
|
||||
Reference in New Issue
Block a user