chore: complete v1.0 Analytics & Monitoring milestone

Archive milestone artifacts (roadmap, requirements, audit, phase directories) to .planning/milestones/. Evolve PROJECT.md with validated requirements and decision outcomes. Create MILESTONES.md and RETROSPECTIVE.md.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
---
phase: 02-backend-services
plan: 01
type: execute
wave: 1
depends_on: []
files_modified:
  - backend/src/models/migrations/013_create_processing_events_table.sql
  - backend/src/services/analyticsService.ts
  - backend/src/__tests__/unit/analyticsService.test.ts
autonomous: true
requirements: [ANLY-01, ANLY-03]

must_haves:
  truths:
    - "recordProcessingEvent() writes to document_processing_events table via Supabase"
    - "recordProcessingEvent() returns void (not Promise) so callers cannot accidentally await it"
    - "A deliberate Supabase write failure logs an error but does not throw or reject"
    - "deleteProcessingEventsOlderThan(30) removes rows older than 30 days"
  artifacts:
    - path: "backend/src/models/migrations/013_create_processing_events_table.sql"
      provides: "document_processing_events table DDL with indexes and RLS"
      contains: "CREATE TABLE IF NOT EXISTS document_processing_events"
    - path: "backend/src/services/analyticsService.ts"
      provides: "Fire-and-forget analytics event writer and retention delete"
      exports: ["recordProcessingEvent", "deleteProcessingEventsOlderThan"]
    - path: "backend/src/__tests__/unit/analyticsService.test.ts"
      provides: "Unit tests for analyticsService"
      min_lines: 50
  key_links:
    - from: "backend/src/services/analyticsService.ts"
      to: "backend/src/config/supabase.ts"
      via: "getSupabaseServiceClient() call"
      pattern: "getSupabaseServiceClient"
    - from: "backend/src/services/analyticsService.ts"
      to: "document_processing_events table"
      via: "void supabase.from('document_processing_events').insert(...)"
      pattern: "void.*from\\('document_processing_events'\\)"
---

<objective>
Create the analytics migration and fire-and-forget analytics service for persisting document processing events to Supabase.

Purpose: ANLY-01 requires processing events to persist (not live in memory), and ANLY-03 requires instrumentation to be non-blocking. This plan creates the database table and the service that writes to it without blocking the processing pipeline.

Output: Migration 013 SQL file, analyticsService.ts with recordProcessingEvent() and deleteProcessingEventsOlderThan(), and unit tests.
</objective>

<execution_context>
@/home/jonathan/.claude/get-shit-done/workflows/execute-plan.md
@/home/jonathan/.claude/get-shit-done/templates/summary.md
</execution_context>

<context>
@.planning/PROJECT.md
@.planning/ROADMAP.md
@.planning/STATE.md
@.planning/phases/02-backend-services/02-RESEARCH.md
@.planning/phases/01-data-foundation/01-01-SUMMARY.md
@.planning/phases/01-data-foundation/01-02-SUMMARY.md
@backend/src/models/migrations/012_create_monitoring_tables.sql
@backend/src/config/supabase.ts
@backend/src/utils/logger.ts
</context>

<tasks>

<task type="auto">
<name>Task 1: Create analytics migration and analyticsService</name>
<files>
backend/src/models/migrations/013_create_processing_events_table.sql
backend/src/services/analyticsService.ts
</files>
<action>
**Migration 013:** Create `backend/src/models/migrations/013_create_processing_events_table.sql` following the exact pattern from migration 012. The table:

```sql
CREATE TABLE IF NOT EXISTS document_processing_events (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  document_id UUID NOT NULL,
  user_id UUID NOT NULL,
  event_type TEXT NOT NULL CHECK (event_type IN ('upload_started', 'processing_started', 'completed', 'failed')),
  duration_ms INTEGER,
  error_message TEXT,
  stage TEXT,
  created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_document_processing_events_created_at
  ON document_processing_events(created_at);
CREATE INDEX IF NOT EXISTS idx_document_processing_events_document_id
  ON document_processing_events(document_id);

ALTER TABLE document_processing_events ENABLE ROW LEVEL SECURITY;
```

**analyticsService.ts:** Create `backend/src/services/analyticsService.ts` with two exports:

1. `recordProcessingEvent(data: ProcessingEventData): void` — the return type MUST be `void` (not `Promise<void>`) to prevent accidental `await`. Inside, call `getSupabaseServiceClient()` (per method call, not at module level), then `void supabase.from('document_processing_events').insert({...}).then(({ error }) => { if (error) logger.error(...) })`. Never throw, never reject.

2. `deleteProcessingEventsOlderThan(days: number): Promise<number>` — compute the cutoff date in JS (`new Date(Date.now() - days * 86400000).toISOString()`), then delete with `.lt('created_at', cutoff)`. Return the count of deleted rows. This follows the same pattern as `HealthCheckModel.deleteOlderThan()`.

Export the `ProcessingEventData` interface:

```typescript
export interface ProcessingEventData {
  document_id: string;
  user_id: string;
  event_type: 'upload_started' | 'processing_started' | 'completed' | 'failed';
  duration_ms?: number;
  error_message?: string;
  stage?: string;
}
```

Use the Winston logger (`import { logger } from '../utils/logger'`). Use `getSupabaseServiceClient` from `'../config/supabase'`. Follow project naming conventions (camelCase filename, named exports).
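Under the constraints above, the service body might look like the following sketch. It is not the file this plan will produce: the real service obtains its client via `getSupabaseServiceClient()` inside each function and uses the project's Winston logger, whereas this self-contained version injects a structurally typed client and a stand-in logger so the fire-and-forget shape can be shown on its own.

```typescript
export interface ProcessingEventData {
  document_id: string;
  user_id: string;
  event_type: 'upload_started' | 'processing_started' | 'completed' | 'failed';
  duration_ms?: number;
  error_message?: string;
  stage?: string;
}

// Structural stand-ins for the Supabase client and Winston logger (assumptions,
// only so this sketch compiles without the project's own modules).
interface SupabaseLike {
  from(table: string): {
    insert(row: Record<string, unknown>): Promise<{ error: { message: string } | null }>;
    delete(): { lt(col: string, value: string): Promise<{ data: unknown[] | null }> };
  };
}
const logged: string[] = [];
const logger = { error: (msg: string) => { logged.push(msg); } };

export function recordProcessingEvent(supabase: SupabaseLike, data: ProcessingEventData): void {
  // `void` discards the promise: callers cannot await it, and a failed
  // write can only reach the logger, never the caller.
  void supabase
    .from('document_processing_events')
    .insert({
      document_id: data.document_id,
      user_id: data.user_id,
      event_type: data.event_type,
      duration_ms: data.duration_ms ?? null,
      error_message: data.error_message ?? null,
      stage: data.stage ?? null,
      created_at: new Date().toISOString(),
    })
    .then(({ error }) => {
      if (error) logger.error(`analytics insert failed: ${error.message}`);
    });
}

export async function deleteProcessingEventsOlderThan(supabase: SupabaseLike, days: number): Promise<number> {
  const cutoff = new Date(Date.now() - days * 86400000).toISOString();
  const { data } = await supabase.from('document_processing_events').delete().lt('created_at', cutoff);
  return data?.length ?? 0;
}
```

The real implementation drops the `supabase` parameter and calls `getSupabaseServiceClient()` per invocation, as specified above.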
</action>
<verify>
<automated>cd /home/jonathan/Coding/cim_summary/backend && npx tsc --noEmit --pretty 2>&1 | head -30</automated>
<manual>Verify that the 013 migration file exists and that analyticsService exports recordProcessingEvent and deleteProcessingEventsOlderThan</manual>
</verify>
<done>Migration 013 creates the document_processing_events table with indexes and RLS. analyticsService.ts exports recordProcessingEvent (void return) and deleteProcessingEventsOlderThan (Promise<number>). TypeScript compiles.</done>
</task>

<task type="auto">
<name>Task 2: Create analyticsService unit tests</name>
<files>
backend/src/__tests__/unit/analyticsService.test.ts
</files>
<action>
Create `backend/src/__tests__/unit/analyticsService.test.ts` using the Vitest + Supabase mock pattern established in Phase 1 (01-02-SUMMARY.md).

Mock setup:
- `vi.mock('../../config/supabase')` with an inline `vi.fn()` factory
- `vi.mock('../../utils/logger')` with an inline `vi.fn()` factory
- Use `vi.mocked()` after import for typed access
- A fresh `makeSupabaseChain()` helper per test (clean mock state)
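The `makeSupabaseChain()` helper is referenced above but not defined in this plan. One plausible shape, shown here with plain recording closures instead of `vi.fn()` so the sketch stands alone, is a builder whose every method records its arguments, returns the chain again, and resolves like a Supabase query:

```typescript
// Hypothetical makeSupabaseChain(): each step records { method, args } and
// returns the chain; awaiting (or .then-ing) the chain yields the canned result.
// In the real test file these recorders would be vi.fn() mocks.
interface ChainCall { method: string; args: unknown[] }

function makeSupabaseChain(result: { error?: { message: string } | null; data?: unknown[] | null }) {
  const calls: ChainCall[] = [];
  const record = (method: string) => (...args: unknown[]) => {
    calls.push({ method, args });
    return chain; // every builder step hands back the chain, like Supabase's query builder
  };
  const chain: any = {
    from: record('from'),
    insert: record('insert'),
    delete: record('delete'),
    lt: record('lt'),
    // thenable: `await chain...` resolves with the canned result
    then: (onFulfilled: (value: typeof result) => unknown) =>
      Promise.resolve(result).then(onFulfilled),
  };
  return { chain, calls };
}
```

A test can then assert on `calls` to check which table, columns, and values were used, and on the canned `result` to drive success and failure paths.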

Test cases for `recordProcessingEvent`:
1. **Calls Supabase insert with correct data** — verify `.from('document_processing_events').insert(...)` is called with the expected fields, including `created_at`
2. **Return type is void (not a Promise)** — call `recordProcessingEvent(data)` and verify the return value is `undefined` (void), not a thenable
3. **Logs error on Supabase failure but does not throw** — resolve the mocked `.then` callback with `{ error: { message: 'test error' } }`, verify `logger.error` was called
4. **Handles optional fields (duration_ms, error_message, stage) as null** — pass data without the optional fields, verify insert is called with `null` for those columns

Test cases for `deleteProcessingEventsOlderThan`:
5. **Computes the correct cutoff date and deletes** — mock the Supabase delete chain, verify `.lt('created_at', ...)` is called with an ISO date string ~30 days in the past
6. **Returns the count of deleted rows** — mock a response with `data: [{}, {}, {}]` (3 rows), verify the function returns 3

Use `beforeEach(() => vi.clearAllMocks())` for test isolation.
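Test case 2 hinges on a subtle point worth isolating: a fire-and-forget function must hand back `undefined`, never anything awaitable, or a rejected background promise could leak to callers. Stripped of the Vitest harness, the core check might look like this (`fireAndForget` is a hypothetical stand-in for `recordProcessingEvent`):

```typescript
// A void-returning fire-and-forget function: background work happens on a
// discarded promise, and the function itself returns undefined.
function fireAndForget(): void {
  void Promise.resolve().then(() => { /* background write */ });
}

// True when a value exposes a callable .then, i.e. could be awaited.
function isThenable(value: unknown): boolean {
  return typeof (value as { then?: unknown } | null | undefined)?.then === 'function';
}

const result = fireAndForget() as unknown;
// result is undefined: `await fireAndForget()` cannot observe the background
// promise, so a failed insert can never reject in the caller.
```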
</action>
<verify>
<automated>cd /home/jonathan/Coding/cim_summary/backend && npx vitest run src/__tests__/unit/analyticsService.test.ts --reporter=verbose 2>&1</automated>
</verify>
<done>All analyticsService tests pass. recordProcessingEvent is verified as fire-and-forget (void return, error-swallowing). deleteProcessingEventsOlderThan is verified with correct date math and row-count return.</done>
</task>

</tasks>

<verification>
1. `npx tsc --noEmit` passes with no errors from the new files
2. `npx vitest run src/__tests__/unit/analyticsService.test.ts` — all tests pass
3. Migration 013 SQL is valid and follows the 012 pattern
4. `recordProcessingEvent` return type is `void` (not `Promise<void>`)
</verification>

<success_criteria>
- Migration 013 creates the document_processing_events table with id, document_id, user_id, event_type (CHECK constraint), duration_ms, error_message, stage, created_at
- Indexes on created_at and document_id exist
- RLS is enabled on the table
- analyticsService.recordProcessingEvent() is fire-and-forget (void return, no throw)
- analyticsService.deleteProcessingEventsOlderThan() returns the deleted row count
- All unit tests pass
</success_criteria>

<output>
After completion, create `.planning/phases/02-backend-services/02-01-SUMMARY.md`
</output>