cim_summary/.planning/milestones/v1.0-phases/03-api-layer/03-02-PLAN.md
admin 38a0f0619d chore: complete v1.0 Analytics & Monitoring milestone
Archive milestone artifacts (roadmap, requirements, audit, phase directories)
to .planning/milestones/. Evolve PROJECT.md with validated requirements and
decision outcomes. Create MILESTONES.md and RETROSPECTIVE.md.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 10:34:18 -05:00


---
phase: 03-api-layer
plan: "02"
type: execute
wave: 1
depends_on:
files_modified:
  - backend/src/services/jobProcessorService.ts
autonomous: true
requirements:
  - ANLY-02
must_haves:
  truths:
    - Document processing emits upload_started event after job is marked as processing
    - Document processing emits completed event with duration_ms after job succeeds
    - Document processing emits failed event with duration_ms and error_message when job fails
    - Analytics instrumentation does not change existing processing behavior or error handling
  artifacts:
    - path: backend/src/services/jobProcessorService.ts
      provides: Analytics instrumentation at 3 lifecycle points in processJob()
      contains: recordProcessingEvent
  key_links:
    - from: backend/src/services/jobProcessorService.ts
      to: backend/src/services/analyticsService.ts
      via: import and call recordProcessingEvent()
      pattern: recordProcessingEvent
---
Instrument the document processing pipeline with fire-and-forget analytics events at key lifecycle points.

Purpose: Enables the analytics endpoint (Plan 03-01) to report real processing data. Without instrumentation, the `document_processing_events` table stays empty and `GET /admin/analytics` returns zeros.

Output: Three `recordProcessingEvent()` calls in `jobProcessorService.processJob()` — one at job start, one at completion, one at failure.

<execution_context> @/home/jonathan/.claude/get-shit-done/workflows/execute-plan.md @/home/jonathan/.claude/get-shit-done/templates/summary.md </execution_context>

@.planning/PROJECT.md @.planning/ROADMAP.md @.planning/STATE.md @.planning/phases/03-api-layer/03-RESEARCH.md

@backend/src/services/jobProcessorService.ts @backend/src/services/analyticsService.ts

**Task 1: Add analytics instrumentation to processJob lifecycle** (`backend/src/services/jobProcessorService.ts`)

**1. Add import at top of file:**

```typescript
import { recordProcessingEvent } from './analyticsService';
```

**2. Emit `upload_started` after `markAsProcessing` (line ~133):**

After `await ProcessingJobModel.markAsProcessing(jobId);` and `jobStatusUpdated = true;`, add:

```typescript
// Analytics: job processing started (fire-and-forget, void return)
recordProcessingEvent({
  document_id: job.document_id,
  user_id: job.user_id,
  event_type: 'upload_started',
});
```

Place this BEFORE the timeout setup block (before `const processingTimeout = ...`).

**3. Emit `completed` after `markAsCompleted` (line ~329):**

After `const processingTime = Date.now() - startTime;` and the `logger.info('Job completed successfully', ...)` call, add:

```typescript
// Analytics: job completed (fire-and-forget, void return)
recordProcessingEvent({
  document_id: job.document_id,
  user_id: job.user_id,
  event_type: 'completed',
  duration_ms: processingTime,
});
```

**4. Emit `failed` in catch block (lines ~355-368):**

After `const processingTime = Date.now() - startTime;` and `logger.error('Job processing failed', ...)`, but BEFORE the `try { await ProcessingJobModel.markAsFailed(...) }` block, add:

```typescript
// Analytics: job failed (fire-and-forget, void return)
// Guard with job check — job is null if findById failed before assignment
if (job) {
  recordProcessingEvent({
    document_id: job.document_id,
    user_id: job.user_id,
    event_type: 'failed',
    duration_ms: processingTime,
    error_message: errorMessage,
  });
}
```
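Taken together, the three calls sit in `processJob()` in the relative order shown below. This is a condensed, runnable sketch only: the real pipeline and `ProcessingJobModel` calls are replaced by a `work` callback and an in-memory recorder, and any name not mentioned in the steps above is hypothetical.

```typescript
// Condensed sketch of the instrumented lifecycle (NOT the real processJob —
// pipeline and DB calls are stubbed so the event ordering can be checked).
interface AnalyticsEvent {
  document_id: string;
  user_id: string;
  event_type: 'upload_started' | 'completed' | 'failed';
  duration_ms?: number;
  error_message?: string;
}

const events: AnalyticsEvent[] = [];
const recordProcessingEvent = (e: AnalyticsEvent): void => {
  events.push(e); // stand-in for the fire-and-forget DB write
};

async function processJob(
  job: { document_id: string; user_id: string },
  work: () => Promise<void>,
): Promise<void> {
  const startTime = Date.now();
  try {
    // ...markAsProcessing succeeded (elided)...
    recordProcessingEvent({
      document_id: job.document_id,
      user_id: job.user_id,
      event_type: 'upload_started',
    });
    await work(); // extraction + summarization (elided)
    // ...markAsCompleted succeeded (elided)...
    recordProcessingEvent({
      document_id: job.document_id,
      user_id: job.user_id,
      event_type: 'completed',
      duration_ms: Date.now() - startTime,
    });
  } catch (err) {
    const errorMessage = err instanceof Error ? err.message : String(err);
    // In the real code, guard on job: it is null if findById threw first.
    if (job) {
      recordProcessingEvent({
        document_id: job.document_id,
        user_id: job.user_id,
        event_type: 'failed',
        duration_ms: Date.now() - startTime,
        error_message: errorMessage,
      });
    }
    // ...markAsFailed(jobId, errorMessage) (elided)...
  }
}
```

Note that the failure event reuses the same `Date.now() - startTime` measurement as the success path, so `duration_ms` is comparable across both outcomes.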

Critical constraints:

  • `recordProcessingEvent` returns `void` (not `Promise<void>`) — do NOT use `await`. This is the fire-and-forget guarantee (PITFALL-6, STATE.md decision).
  • Do NOT wrap in try/catch — the function internally catches all errors and logs them.
  • Do NOT modify any existing code around the instrumentation points — add lines, don't change lines.
  • Guard `job` in the catch block — it can be null if `findById` threw before assignment.
  • Use `event_type: 'upload_started'` (not `'processing_started'`) — per locked decision, key milestones only: upload started, processing complete, processing failed.

Verification:

```shell
cd /home/jonathan/Coding/cim_summary/backend && npx tsc --noEmit 2>&1 | head -30 && npx vitest run --reporter=verbose 2>&1 | tail -20
```

Verify 3 `recordProcessingEvent` calls exist in `jobProcessorService.ts` and that none use `await`.

Done when: `processJob()` emits `upload_started` after `markAsProcessing`, `completed` with duration after `markAsCompleted`, and `failed` with duration + error in the catch block. All calls are fire-and-forget (no `await`). Existing processing logic unchanged — no behavior modification.
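For reference, a recorder with the properties listed above (synchronous `void` return, errors caught and logged internally) might be shaped like the following. This is a sketch, not the actual `analyticsService` implementation — `insertEvent`, its failure mode, and the logging are all assumptions.

```typescript
interface ProcessingEvent {
  document_id: string;
  user_id: string;
  event_type: 'upload_started' | 'completed' | 'failed';
  duration_ms?: number;
  error_message?: string;
}

// Hypothetical async DB write (the real one would target the
// document_processing_events table).
const written: ProcessingEvent[] = [];
async function insertEvent(event: ProcessingEvent): Promise<void> {
  if (!event.document_id) throw new Error('invalid event'); // simulated failure
  written.push(event);
}

// Fire-and-forget: returns void synchronously and attaches a .catch so the
// rejection is always handled — callers never need await or try/catch, and a
// failed insert can never affect the processing pipeline.
function recordProcessingEvent(event: ProcessingEvent): void {
  insertEvent(event).catch((err) => {
    console.error('analytics insert failed', err);
  });
}
```

This shape is why the constraints forbid `await` and try/catch at the call sites: the asynchrony and error handling live entirely inside the recorder.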
1. `npx tsc --noEmit` passes with no errors
2. `npx vitest run` — all existing tests pass (no regressions)
3. `grep -c 'recordProcessingEvent' backend/src/services/jobProcessorService.ts` returns 3
4. `grep 'await recordProcessingEvent' backend/src/services/jobProcessorService.ts` returns nothing (no accidental await)
5. `recordProcessingEvent` import exists at top of file

<success_criteria>

  • TypeScript compiles without errors
  • All existing tests pass (zero regressions)
  • Three recordProcessingEvent calls at correct lifecycle points
  • No await on recordProcessingEvent (fire-and-forget preserved)
  • job null-guard in catch block prevents runtime errors
  • No changes to existing processing logic

</success_criteria>
After completion, create `.planning/phases/03-api-layer/03-02-SUMMARY.md`