feat: Complete implementation of Tasks 1-5 - CIM Document Processor

Backend Infrastructure:
- Complete Express server setup with security middleware (helmet, CORS, rate limiting)
- Comprehensive error handling and logging with Winston
- Authentication system with JWT tokens and session management
- Database models and migrations for Users, Documents, Feedback, and Processing Jobs
- API routes structure for authentication and document management
- Integration tests for all server components (86 tests passing)

Frontend Infrastructure:
- React application with TypeScript and Vite
- Authentication UI with login form, protected routes, and logout functionality
- Authentication context with proper async state management
- Component tests with proper async handling (25 tests passing)
- Tailwind CSS styling and responsive design

Key Features:
- User registration, login, and authentication
- Protected routes with role-based access control
- Comprehensive error handling and user feedback
- Database schema with proper relationships
- Security middleware and validation
- Production-ready build configuration

Test Coverage: 111/111 tests passing
Tasks Completed: 1-5 (Project setup, Database, Auth system, Frontend UI, Backend infrastructure)

Ready for Task 6: File upload backend infrastructure
Jon
2025-07-27 13:29:26 -04:00
commit 5a3c961bfc
72 changed files with 24326 additions and 0 deletions

.gitignore

@@ -0,0 +1,115 @@
# Dependencies
node_modules/
# Build outputs
dist/
build/
*.tsbuildinfo
# Environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
# Logs
logs/
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Runtime data
pids/
*.pid
*.seed
*.pid.lock
# Coverage directory used by tools like istanbul
coverage/
*.lcov
# nyc test coverage
.nyc_output
# Dependency directories
jspm_packages/
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache
# Next.js build output
.next
# Nuxt.js build / generate output
.nuxt
# Gatsby files
.cache/
public
# Storybook build outputs
.out
.storybook-out
# Temporary folders
tmp/
temp/
# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
# OS generated files
Thumbs.db
# Database
*.db
*.sqlite
# Uploads
uploads/
*.pdf
*.doc
*.docx
# Test results
test_results.txt
frontend_test_results.txt
# Kiro specific
.kiro/cache/


@@ -0,0 +1,381 @@
# Design Document
## Overview
The CIM Document Processor is a web-based application that enables authenticated team members to upload large PDF documents (CIMs), have them analyzed by an LLM using a structured template, and download the results in both Markdown and PDF formats. The system follows a modern web architecture with secure authentication, robust file processing, and comprehensive admin oversight.
## Architecture
### High-Level Architecture
```mermaid
graph TB
subgraph "Frontend Layer"
UI[React Web Application]
Auth[Authentication UI]
Upload[File Upload Interface]
Dashboard[User Dashboard]
Admin[Admin Panel]
end
subgraph "Backend Layer"
API[Express.js API Server]
AuthM[Authentication Middleware]
FileH[File Handler Service]
LLMS[LLM Processing Service]
PDF[PDF Generation Service]
end
subgraph "Data Layer"
DB[(PostgreSQL Database)]
FileStore["File Storage (AWS S3/Local)"]
Cache[Redis Cache]
end
subgraph "External Services"
LLM["LLM API (OpenAI/Anthropic)"]
PDFLib[PDF Processing Library]
end
UI --> API
Auth --> AuthM
Upload --> FileH
Dashboard --> API
Admin --> API
API --> DB
API --> FileStore
API --> Cache
FileH --> FileStore
LLMS --> LLM
PDF --> PDFLib
API --> LLMS
API --> PDF
```
### Technology Stack
**Frontend:**
- React 18 with TypeScript
- Tailwind CSS for styling
- React Router for navigation
- Axios for API communication
- React Query for state management and caching
**Backend:**
- Node.js with Express.js
- TypeScript for type safety
- JWT for authentication
- Multer for file uploads
- Bull Queue for background job processing
**Database:**
- PostgreSQL for primary data storage
- Redis for session management and job queues
**File Processing:**
- PDF-parse for text extraction
- Puppeteer for PDF generation from Markdown
- AWS S3 or local file system for file storage
**LLM Integration:**
- OpenAI API or Anthropic Claude API
- Configurable model selection
- Token management and rate limiting
## Components and Interfaces
### Frontend Components
#### Authentication Components
- `LoginForm`: Handles user login with validation
- `AuthGuard`: Protects routes requiring authentication
- `SessionManager`: Manages user session state
#### Upload Components
- `FileUploader`: Drag-and-drop PDF upload with progress
- `UploadValidator`: Client-side file validation
- `UploadProgress`: Real-time upload status display
#### Dashboard Components
- `DocumentList`: Displays user's uploaded documents
- `DocumentCard`: Individual document status and actions
- `ProcessingStatus`: Real-time processing updates
- `DownloadButtons`: Markdown and PDF download options
#### Admin Components
- `AdminDashboard`: Overview of all system documents
- `UserManagement`: User account management
- `DocumentArchive`: System-wide document access
- `SystemMetrics`: Storage and processing statistics
### Backend Services
#### Authentication Service
```typescript
interface AuthService {
login(credentials: LoginCredentials): Promise<AuthResult>
validateToken(token: string): Promise<User>
logout(userId: string): Promise<void>
refreshToken(refreshToken: string): Promise<AuthResult>
}
```
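A minimal sketch of the token mechanics behind `validateToken`, using only Node's built-in `crypto` (a production build would use a vetted library such as `jsonwebtoken`; the payload fields and HS256 choice here are assumptions):

```typescript
import { createHmac, timingSafeEqual } from "crypto";

// Hypothetical sketch of the JWT-style flow behind AuthService.validateToken.
// Illustrates HMAC-SHA256 signing, constant-time verification, and expiry.
interface TokenPayload {
  userId: string;
  role: "user" | "admin";
  exp: number; // unix seconds
}

const b64url = (s: string) => Buffer.from(s).toString("base64url");

function sign(payload: TokenPayload, secret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const body = b64url(JSON.stringify(payload));
  const sig = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  return `${header}.${body}.${sig}`;
}

function verify(token: string, secret: string): TokenPayload | null {
  const [header, body, sig] = token.split(".");
  if (!header || !body || !sig) return null;
  const expected = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest();
  const given = Buffer.from(sig, "base64url");
  if (given.length !== expected.length || !timingSafeEqual(given, expected)) {
    return null; // signature mismatch or tampering
  }
  const payload: TokenPayload = JSON.parse(
    Buffer.from(body, "base64url").toString()
  );
  if (payload.exp < Date.now() / 1000) return null; // expired token
  return payload;
}
```

The short-expiry and refresh-rotation policies from the Security Considerations section would sit on top of this primitive.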
#### Document Service
```typescript
interface DocumentService {
uploadDocument(file: File, userId: string): Promise<Document>
getDocuments(userId: string): Promise<Document[]>
getDocument(documentId: string): Promise<Document>
deleteDocument(documentId: string): Promise<void>
updateDocumentStatus(documentId: string, status: ProcessingStatus): Promise<void>
}
```
#### LLM Processing Service
```typescript
interface LLMService {
processDocument(documentId: string, extractedText: string): Promise<ProcessingResult>
regenerateWithFeedback(documentId: string, feedback: string): Promise<ProcessingResult>
validateOutput(output: string): Promise<ValidationResult>
}
```
#### PDF Service
```typescript
interface PDFService {
extractText(filePath: string): Promise<string>
generatePDF(markdown: string): Promise<Buffer>
validatePDF(filePath: string): Promise<boolean>
}
```
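`validatePDF` can be fronted by a cheap magic-byte check before the buffer ever reaches pdf-parse; a sketch (the helper name is hypothetical):

```typescript
// Hypothetical pre-check for PDFService.validatePDF: every well-formed PDF
// begins with the ASCII header "%PDF-" followed by a version number.
// This rejects obvious non-PDFs cheaply; full structural validation is
// left to the parsing library.
function looksLikePdf(buf: Buffer): boolean {
  if (buf.length < 8) return false; // too small to hold a header + version
  return buf.subarray(0, 5).toString("ascii") === "%PDF-";
}
```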
## Data Models
### User Model
```typescript
interface User {
id: string
email: string
name: string
role: 'user' | 'admin'
createdAt: Date
updatedAt: Date
}
```
### Document Model
```typescript
interface Document {
id: string
userId: string
originalFileName: string
filePath: string
fileSize: number
uploadedAt: Date
status: ProcessingStatus
extractedText?: string
generatedSummary?: string
summaryMarkdownPath?: string
summaryPdfPath?: string
processingStartedAt?: Date
processingCompletedAt?: Date
errorMessage?: string
feedback?: DocumentFeedback[]
versions: DocumentVersion[]
}
type ProcessingStatus =
| 'uploaded'
| 'extracting_text'
| 'processing_llm'
| 'generating_pdf'
| 'completed'
| 'failed'
```
### Document Feedback Model
```typescript
interface DocumentFeedback {
id: string
documentId: string
userId: string
feedback: string
regenerationInstructions?: string
createdAt: Date
}
```
### Document Version Model
```typescript
interface DocumentVersion {
id: string
documentId: string
versionNumber: number
summaryMarkdown: string
summaryPdfPath: string
createdAt: Date
feedback?: string
}
```
### Processing Job Model
```typescript
interface ProcessingJob {
id: string
documentId: string
type: 'text_extraction' | 'llm_processing' | 'pdf_generation'
status: 'pending' | 'processing' | 'completed' | 'failed'
progress: number
errorMessage?: string
createdAt: Date
startedAt?: Date
completedAt?: Date
}
```
## Error Handling
### Frontend Error Handling
- Global error boundary for React components
- Toast notifications for user-facing errors
- Retry mechanisms for failed API calls
- Graceful degradation for offline scenarios
### Backend Error Handling
- Centralized error middleware
- Structured error logging with Winston
- Error categorization (validation, processing, system)
- Automatic retry for transient failures
### File Processing Error Handling
- PDF validation before processing
- Text extraction fallback mechanisms
- LLM API timeout and retry logic
- Cleanup of failed uploads and partial processing
### Error Types
```typescript
enum ErrorType {
VALIDATION_ERROR = 'validation_error',
AUTHENTICATION_ERROR = 'authentication_error',
FILE_PROCESSING_ERROR = 'file_processing_error',
LLM_PROCESSING_ERROR = 'llm_processing_error',
STORAGE_ERROR = 'storage_error',
SYSTEM_ERROR = 'system_error'
}
```
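A sketch of how the centralized middleware might translate these categories into HTTP statuses and retry decisions (the specific status codes are assumptions, not fixed by this design):

```typescript
// Enum repeated here so the sketch is self-contained.
enum ErrorType {
  VALIDATION_ERROR = "validation_error",
  AUTHENTICATION_ERROR = "authentication_error",
  FILE_PROCESSING_ERROR = "file_processing_error",
  LLM_PROCESSING_ERROR = "llm_processing_error",
  STORAGE_ERROR = "storage_error",
  SYSTEM_ERROR = "system_error",
}

// Hypothetical mapping the error middleware could use to pick a status code.
function httpStatusFor(type: ErrorType): number {
  switch (type) {
    case ErrorType.VALIDATION_ERROR: return 400;
    case ErrorType.AUTHENTICATION_ERROR: return 401;
    case ErrorType.FILE_PROCESSING_ERROR: return 422;
    case ErrorType.LLM_PROCESSING_ERROR: return 502; // upstream API failure
    case ErrorType.STORAGE_ERROR:
    case ErrorType.SYSTEM_ERROR: return 500;
  }
}

// Transient categories qualify for the automatic-retry behavior described
// in "Backend Error Handling"; validation and auth failures never do.
function isTransient(type: ErrorType): boolean {
  return (
    type === ErrorType.LLM_PROCESSING_ERROR ||
    type === ErrorType.STORAGE_ERROR
  );
}
```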
## Testing Strategy
### Unit Testing
- Jest for JavaScript/TypeScript testing
- React Testing Library for component testing
- Supertest for API endpoint testing
- Mock LLM API responses for consistent testing
### Integration Testing
- Database integration tests with test containers
- File upload and processing workflow tests
- Authentication flow testing
- PDF generation and download testing
### End-to-End Testing
- Playwright for browser automation
- Complete user workflows (upload → process → download)
- Admin functionality testing
- Error scenario testing
### Performance Testing
- Load testing for file uploads
- LLM processing performance benchmarks
- Database query optimization testing
- Memory usage monitoring during PDF processing
### Security Testing
- Authentication and authorization testing
- File upload security validation
- SQL injection prevention testing
- XSS and CSRF protection verification
## LLM Integration Design
### Prompt Engineering
The system will use a two-part prompt structure:
**Part 1: CIM Data Extraction**
- Provide the BPCP CIM Review Template
- Instruct LLM to populate only from CIM content
- Use "Not specified in CIM" for missing information
- Maintain strict markdown formatting
**Part 2: Investment Analysis**
- Add "Key Investment Considerations & Diligence Areas" section
- Allow use of general industry knowledge
- Focus on investment-specific insights and risks
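As a sketch, the two parts might be assembled into a single prompt like this (the exact instruction wording is an assumption; the real template text would be loaded from `BPCP CIM REVIEW TEMPLATE.md`):

```typescript
// Hypothetical assembly of the two-part prompt described above.
function buildCimPrompt(template: string, cimText: string): string {
  return [
    "PART 1 - CIM DATA EXTRACTION",
    "Populate the template below using ONLY information found in the CIM text.",
    'Where a field cannot be filled from the CIM, write "Not specified in CIM".',
    "Maintain strict markdown formatting and the template structure.",
    "",
    "TEMPLATE:",
    template,
    "",
    "PART 2 - INVESTMENT ANALYSIS",
    'After the template, add a "Key Investment Considerations & Diligence Areas"',
    "section. Here you may draw on general industry knowledge, focusing on",
    "investment-specific insights and risks.",
    "",
    "CIM TEXT:",
    cimText,
  ].join("\n");
}
```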
### Token Management
- Document chunking for large PDFs (>100 pages)
- Token counting and optimization
- Fallback to smaller context windows if needed
- Cost tracking and monitoring
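Chunking can be sketched with a rough four-characters-per-token heuristic (the ratio and overlap are assumptions; a real service would count tokens with the provider's tokenizer):

```typescript
// Hypothetical chunker for documents that exceed the model's context window.
// Uses ~4 chars per token as a rough estimate; production code would count
// real tokens (e.g. with the provider's tokenizer) instead.
function chunkByTokens(
  text: string,
  maxTokens: number,
  overlapTokens = 200
): string[] {
  const maxChars = maxTokens * 4;
  const overlapChars = overlapTokens * 4;
  if (text.length <= maxChars) return [text];
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    const end = Math.min(start + maxChars, text.length);
    chunks.push(text.slice(start, end));
    if (end === text.length) break;
    start = end - overlapChars; // overlap so context carries across chunks
  }
  return chunks;
}
```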
### Output Validation
- Markdown syntax validation
- Template structure verification
- Content completeness checking
- Retry mechanism for malformed outputs
## Security Considerations
### Authentication & Authorization
- JWT tokens with short expiration times
- Refresh token rotation
- Role-based access control (user/admin)
- Session management with Redis
### File Security
- File type validation (PDF only)
- File size limits (100MB max)
- Virus scanning integration
- Secure file storage with access controls
### Data Protection
- Encryption at rest for sensitive documents
- HTTPS enforcement for all communications
- Input sanitization and validation
- Audit logging for admin actions
### API Security
- Rate limiting on all endpoints
- CORS configuration
- Request size limits
- API key management for LLM services
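Rate limiting would normally come from middleware such as express-rate-limit; the fixed-window logic underneath is roughly the following (the limits shown are illustrative, not the actual configuration):

```typescript
// Minimal fixed-window rate limiter sketching what express-rate-limit
// provides. windowMs/max mirror that library's option names.
class FixedWindowLimiter {
  private hits = new Map<string, { count: number; windowStart: number }>();
  constructor(private windowMs: number, private max: number) {}

  // Returns false when the caller should receive 429 Too Many Requests.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.hits.set(key, { count: 1, windowStart: now }); // new window
      return true;
    }
    entry.count += 1;
    return entry.count <= this.max;
  }
}
```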
## Performance Optimization
### File Processing
- Asynchronous processing with job queues
- Progress tracking and status updates
- Parallel processing for multiple documents
- Efficient PDF text extraction
### Database Optimization
- Proper indexing on frequently queried fields
- Connection pooling
- Query optimization
- Database migrations management
### Caching Strategy
- Redis caching for user sessions
- Document metadata caching
- LLM response caching for similar content
- Static asset caching
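The metadata-caching idea reduces to a TTL map; a minimal in-memory sketch (production would back this with Redis and its key expiry, and the TTL value is an assumption):

```typescript
// Hypothetical in-memory TTL cache illustrating the document-metadata
// caching strategy. The clock is injectable to keep the sketch testable.
class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();
  constructor(private ttlMs: number) {}

  get(key: string, now: number = Date.now()): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (now >= entry.expiresAt) {
      this.store.delete(key); // lazily evict stale entries on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V, now: number = Date.now()): void {
    this.store.set(key, { value, expiresAt: now + this.ttlMs });
  }
}
```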
### Scalability Considerations
- Horizontal scaling capability
- Load balancing for multiple instances
- Database read replicas
- CDN for static assets and downloads


@@ -0,0 +1,130 @@
# Requirements Document
## Introduction
This feature enables team members to upload CIM (Confidential Information Memorandum) documents through a secure web interface, have them analyzed by an LLM for detailed review, and receive structured summaries in both Markdown and PDF formats. The system provides authentication, document processing, and downloadable outputs following a specific template format.
## Requirements
### Requirement 1
**User Story:** As a team member, I want to securely log into the website, so that I can access the CIM processing functionality with proper authentication.
#### Acceptance Criteria
1. WHEN a user visits the website THEN the system SHALL display a login page
2. WHEN a user enters valid credentials THEN the system SHALL authenticate them and redirect to the main dashboard
3. WHEN a user enters invalid credentials THEN the system SHALL display an error message and remain on the login page
4. WHEN a user is not authenticated THEN the system SHALL redirect them to the login page for any protected routes
5. WHEN a user logs out THEN the system SHALL clear their session and redirect to the login page
### Requirement 2
**User Story:** As an authenticated team member, I want to upload CIM PDF documents (75-100+ pages), so that I can have them processed and analyzed.
#### Acceptance Criteria
1. WHEN a user accesses the upload interface THEN the system SHALL display a file upload component
2. WHEN a user selects a PDF file THEN the system SHALL validate it is a PDF format
3. WHEN a user uploads a file larger than 100MB THEN the system SHALL reject it with an appropriate error message
4. WHEN a user uploads a non-PDF file THEN the system SHALL reject it with an appropriate error message
5. WHEN a valid PDF is uploaded THEN the system SHALL store it securely and initiate processing
6. WHEN upload is in progress THEN the system SHALL display upload progress to the user
### Requirement 3
**User Story:** As a team member, I want the uploaded CIM to be reviewed in detail by an LLM using a two-part analysis process, so that I can get both structured data extraction and expert investment analysis.
#### Acceptance Criteria
1. WHEN a CIM document is uploaded THEN the system SHALL extract text content from the PDF
2. WHEN text extraction is complete THEN the system SHALL send the content to an LLM with the predefined analysis prompt
3. WHEN LLM processing begins THEN the system SHALL execute Part 1 (CIM Data Extraction) using only information from the CIM text
4. WHEN Part 1 is complete THEN the system SHALL execute Part 2 (Analyst Diligence Questions) using both CIM content and general industry knowledge
5. WHEN LLM processing is in progress THEN the system SHALL display processing status to the user
6. WHEN LLM analysis fails THEN the system SHALL log the error and notify the user
7. WHEN LLM analysis is complete THEN the system SHALL store both the populated template and diligence analysis results
8. IF the document is too large for single LLM processing THEN the system SHALL chunk it appropriately and process in segments
### Requirement 4
**User Story:** As a team member, I want the LLM to populate the predefined BPCP CIM Review Template with extracted data and include investment diligence analysis, so that I receive consistent and structured summaries following our established format.
#### Acceptance Criteria
1. WHEN LLM processing begins THEN the system SHALL provide both the CIM text and the BPCP CIM Review Template to the LLM
2. WHEN executing Part 1 THEN the system SHALL ensure the LLM populates all template sections (A-G) using only CIM-sourced information
3. WHEN template fields cannot be populated from CIM THEN the system SHALL ensure "Not specified in CIM" is entered
4. WHEN executing Part 2 THEN the system SHALL ensure the LLM adds a "Key Investment Considerations & Diligence Areas" section
5. WHEN LLM processing is complete THEN the system SHALL validate the output maintains proper markdown formatting and template structure
6. WHEN template validation fails THEN the system SHALL log the error and retry the LLM processing
7. WHEN the populated template is ready THEN the system SHALL store it as the final markdown summary
### Requirement 5
**User Story:** As a team member, I want to download the CIM summary in both Markdown and PDF formats, so that I can use the analysis in different contexts and share it appropriately.
#### Acceptance Criteria
1. WHEN a CIM summary is ready THEN the system SHALL provide download links for both MD and PDF formats
2. WHEN a user clicks the Markdown download THEN the system SHALL serve the .md file for download
3. WHEN a user clicks the PDF download THEN the system SHALL convert the markdown to PDF and serve it for download
4. WHEN PDF conversion is in progress THEN the system SHALL display conversion status
5. WHEN PDF conversion fails THEN the system SHALL log the error and notify the user
6. WHEN downloads are requested THEN the system SHALL ensure proper file naming with timestamps
### Requirement 6
**User Story:** As a team member, I want to view the processing status and history of my uploaded CIMs, so that I can track progress and access previous analyses.
#### Acceptance Criteria
1. WHEN a user accesses the dashboard THEN the system SHALL display a list of their uploaded documents
2. WHEN viewing document history THEN the system SHALL show upload date, processing status, and completion status
3. WHEN a document is processing THEN the system SHALL display real-time status updates
4. WHEN a document processing is complete THEN the system SHALL show download options
5. WHEN a document processing fails THEN the system SHALL display error information and retry options
6. WHEN viewing document details THEN the system SHALL show file name, size, and processing timestamps
### Requirement 7
**User Story:** As a team member, I want to provide feedback on generated summaries and request regeneration with specific instructions, so that I can get summaries that better meet my needs.
#### Acceptance Criteria
1. WHEN viewing a completed summary THEN the system SHALL provide a feedback interface for user comments
2. WHEN a user submits feedback THEN the system SHALL store the commentary with the document record
3. WHEN a user requests summary regeneration THEN the system SHALL provide a text field for specific instructions
4. WHEN regeneration is requested THEN the system SHALL reprocess the document using the original content plus user instructions
5. WHEN regeneration is complete THEN the system SHALL replace the previous summary with the new version
6. WHEN multiple regenerations occur THEN the system SHALL maintain a history of previous versions
7. WHEN viewing summary history THEN the system SHALL show timestamps and user feedback for each version
### Requirement 8
**User Story:** As a system administrator, I want to view and manage all uploaded PDF files and summary files from all users, so that I can maintain an archive and have oversight of all processed documents.
#### Acceptance Criteria
1. WHEN an administrator accesses the admin dashboard THEN the system SHALL display all uploaded documents from all users
2. WHEN viewing the admin archive THEN the system SHALL show document details including uploader, upload date, and processing status
3. WHEN an administrator selects a document THEN the system SHALL provide access to both original PDF and generated summaries
4. WHEN an administrator downloads files THEN the system SHALL log the admin access for audit purposes
5. WHEN viewing user documents THEN the system SHALL display user information alongside document metadata
6. WHEN searching the archive THEN the system SHALL allow filtering by user, date range, and processing status
7. WHEN an administrator deletes a document THEN the system SHALL remove both the original PDF and all generated summaries
8. WHEN an administrator confirms deletion THEN the system SHALL log the deletion action for audit purposes
9. WHEN files are deleted THEN the system SHALL free up storage space and update storage metrics
### Requirement 9
**User Story:** As a system administrator, I want the application to handle errors gracefully and maintain security, so that the system remains stable and user data is protected.
#### Acceptance Criteria
1. WHEN any system error occurs THEN the system SHALL log detailed error information
2. WHEN file uploads fail THEN the system SHALL clean up any partial uploads
3. WHEN LLM processing fails THEN the system SHALL retry up to 3 times before marking as failed
4. WHEN user sessions expire THEN the system SHALL redirect to login without data loss
5. WHEN unauthorized access is attempted THEN the system SHALL log the attempt and deny access
6. WHEN sensitive data is processed THEN the system SHALL ensure encryption at rest and in transit


@@ -0,0 +1,197 @@
# Implementation Plan
- [x] 1. Set up project structure and core configuration
- Create directory structure for frontend (React) and backend (Node.js/Express)
- Initialize package.json files with required dependencies
- Set up TypeScript configuration for both frontend and backend
- Configure build tools (Vite for frontend, ts-node for backend)
- Create environment configuration files and validation
- _Requirements: All requirements depend on proper project setup_
- [x] 2. Implement database schema and models
- Set up PostgreSQL database connection and configuration
- Create database migration files for User, Document, DocumentFeedback, DocumentVersion, and ProcessingJob tables
- Implement TypeScript interfaces and database models using an ORM (Prisma or TypeORM)
- Create database seeding scripts for development
- Write unit tests for database models and relationships
- _Requirements: 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, 1.9_
- [x] 3. Build authentication system
- Implement JWT token generation and validation utilities
- Create user registration and login API endpoints
- Build password hashing and validation functions
- Implement session management with Redis integration
- Create authentication middleware for protected routes
- Write unit tests for authentication functions and middleware
- _Requirements: 1.1, 1.2, 1.3, 1.4, 1.5_
- [x] 4. Create basic frontend authentication UI
- Build login form component with validation
- Implement authentication context and state management
- Create protected route wrapper component
- Build logout functionality
- Add error handling and user feedback for authentication
- Write component tests for authentication UI
- _Requirements: 1.1, 1.2, 1.3, 1.4, 1.5_
- [x] 5. Create main backend server and API infrastructure
- Create main Express server entry point (index.ts)
- Set up middleware (CORS, helmet, morgan, rate limiting)
- Configure route mounting and error handling
- Create document upload API endpoints structure
- Set up basic API response formatting
- Write integration tests for server setup
- _Requirements: 2.1, 2.2, 2.3, 2.4, 2.5, 2.6_
- [ ] 6. Implement file upload backend infrastructure
- Set up multer middleware for file uploads with validation
- Create file storage service (local filesystem or AWS S3)
- Implement PDF file validation (type, size, format)
- Build file cleanup utilities for failed uploads
- Create upload progress tracking system
- Write unit tests for file upload validation and storage
- _Requirements: 2.1, 2.2, 2.3, 2.4, 2.5, 2.6_
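The validation rules in this task reduce to checks like the following sketch (mirroring a multer `fileFilter` plus a `fileSize` limit; the field names are hypothetical, while the 100MB cap comes from Requirement 2):

```typescript
// Hypothetical upload validation: PDF mimetype, .pdf extension, and the
// 100MB size cap. In practice multer enforces the size limit and a
// fileFilter callback enforces the rest.
const MAX_UPLOAD_BYTES = 100 * 1024 * 1024;

interface UploadCandidate {
  originalName: string;
  mimeType: string;
  sizeBytes: number;
}

function validateUpload(f: UploadCandidate): { ok: boolean; reason?: string } {
  if (!f.originalName.toLowerCase().endsWith(".pdf")) {
    return { ok: false, reason: "Only .pdf files are accepted" };
  }
  if (f.mimeType !== "application/pdf") {
    return { ok: false, reason: "File must be a PDF (application/pdf)" };
  }
  if (f.sizeBytes > MAX_UPLOAD_BYTES) {
    return { ok: false, reason: "File exceeds the 100MB limit" };
  }
  return { ok: true };
}
```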
- [ ] 7. Build file upload frontend interface
- Create drag-and-drop file upload component
- Implement upload progress display with real-time updates
- Add file validation feedback and error messages
- Build upload success confirmation and next steps UI
- Integrate with backend upload API endpoints
- Write component tests for upload functionality
- _Requirements: 2.1, 2.2, 2.3, 2.4, 2.5, 2.6_
- [ ] 8. Implement PDF text extraction service
- Install and configure PDF parsing library (pdf-parse)
- Create text extraction service with error handling
- Implement text chunking for large documents
- Add text quality validation and cleanup
- Create extraction progress tracking
- Write unit tests for text extraction with sample PDFs
- _Requirements: 3.1, 3.8_
- [ ] 9. Set up job queue system for background processing
- Configure Bull queue with Redis backend
- Create job types for text extraction, LLM processing, and PDF generation
- Implement job progress tracking and status updates
- Build job retry logic with exponential backoff
- Create job monitoring and cleanup utilities
- Write unit tests for job queue functionality
- _Requirements: 3.5, 3.6, 3.8_
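The retry logic above is exponential backoff with a cap; as a pure function it looks roughly like this (one common formulation of the idea, not Bull's exact built-in formula):

```typescript
// Hypothetical retry-delay schedule for failed jobs: double the delay on
// each attempt, capped so a flaky LLM API cannot stall a job indefinitely.
function backoffDelayMs(attempt: number, baseMs = 1000, capMs = 60000): number {
  // attempt 1 -> base, attempt 2 -> 2x, attempt 3 -> 4x, ... capped at capMs
  return Math.min(baseMs * 2 ** (attempt - 1), capMs);
}
```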
- [ ] 10. Implement LLM integration service
- Set up OpenAI or Anthropic API client with configuration
- Create prompt templates for Part 1 (CIM Data Extraction) and Part 2 (Investment Analysis)
- Implement token counting and document chunking logic
- Build LLM response validation and retry mechanisms
- Create cost tracking and rate limiting
- Write unit tests with mocked LLM responses
- _Requirements: 3.2, 3.3, 3.4, 3.6, 3.7, 3.8, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6, 4.7_
- [ ] 11. Build document processing workflow orchestration
- Create main processing service that coordinates all steps
- Implement workflow: upload → text extraction → LLM processing → storage
- Add error handling and recovery for each processing step
- Build processing status updates and user notifications
- Create processing history and audit logging
- Write integration tests for complete processing workflow
- _Requirements: 3.1, 3.2, 3.3, 3.4, 3.5, 3.6, 3.7, 3.8_
- [ ] 12. Implement markdown to PDF conversion service
- Set up Puppeteer for PDF generation from markdown
- Create PDF styling and formatting templates
- Implement PDF generation with proper error handling
- Add PDF file naming with timestamps
- Create PDF validation and quality checks
- Write unit tests for PDF generation with sample markdown
- _Requirements: 5.3, 5.4, 5.5, 5.6_
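The timestamped naming in this task might look like the following sketch (the exact pattern is an assumption; only the "timestamped, filesystem-safe" requirement comes from the plan):

```typescript
// Hypothetical file-naming helper for generated summaries (Requirement 5.6).
// Produces e.g. "acme-cim_summary_2025-07-27T13-29-26.pdf".
function summaryFileName(
  originalPdfName: string,
  ext: "md" | "pdf",
  when: Date = new Date()
): string {
  const base = originalPdfName.replace(/\.pdf$/i, "");
  // Colons are not allowed in filenames on some platforms; replace them,
  // and drop the milliseconds/zone suffix for readability.
  const stamp = when.toISOString().replace(/:/g, "-").replace(/\..*$/, "");
  return `${base}_summary_${stamp}.${ext}`;
}
```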
- [ ] 13. Build document download API endpoints
- Create API endpoints for markdown and PDF downloads
- Implement secure file serving with authentication checks
- Add download logging and audit trails
- Build file streaming for large PDF downloads
- Create download error handling and user feedback
- Write unit tests for download endpoints
- _Requirements: 5.1, 5.2, 5.3, 5.4, 5.5, 5.6_
- [ ] 14. Create user dashboard frontend
- Build document list component with status indicators
- Implement real-time processing status updates using WebSockets or polling
- Create document detail view with metadata display
- Add download buttons for completed documents
- Build error display and retry functionality
- Write component tests for dashboard functionality
- _Requirements: 6.1, 6.2, 6.3, 6.4, 6.5, 6.6_
- [ ] 15. Implement feedback and regeneration system
- Create feedback submission API endpoints
- Build feedback storage and retrieval functionality
- Implement document regeneration with user instructions
- Create version history tracking and management
- Add regeneration progress tracking and notifications
- Write unit tests for feedback and regeneration features
- _Requirements: 7.1, 7.2, 7.3, 7.4, 7.5, 7.6, 7.7_
- [ ] 16. Build feedback and regeneration UI
- Create feedback form component with text input
- Implement regeneration request interface
- Build version history display and navigation
- Add regeneration progress indicators
- Create comparison view for different versions
- Write component tests for feedback UI
- _Requirements: 7.1, 7.2, 7.3, 7.4, 7.5, 7.6, 7.7_
- [ ] 17. Implement admin dashboard backend
- Create admin-only API endpoints for system overview
- Build user management functionality (view, disable users)
- Implement system-wide document access and management
- Create admin audit logging and activity tracking
- Add storage metrics and system health monitoring
- Write unit tests for admin functionality
- _Requirements: 8.1, 8.2, 8.3, 8.4, 8.5, 8.6, 8.7, 8.8, 8.9_
- [ ] 18. Build admin dashboard frontend
- Create admin panel with user and document overview
- Implement document search and filtering functionality
- Build user management interface
- Add system metrics and storage usage displays
- Create admin action confirmation dialogs
- Write component tests for admin UI
- _Requirements: 8.1, 8.2, 8.3, 8.4, 8.5, 8.6, 8.7, 8.8, 8.9_
- [ ] 19. Implement comprehensive error handling and logging
- Set up Winston logging with different log levels
- Create centralized error handling middleware
- Implement error categorization and user-friendly messages
- Add error recovery and retry mechanisms
- Create error monitoring and alerting system
- Write tests for error scenarios and recovery
- _Requirements: 9.1, 9.2, 9.3, 9.4, 9.5, 9.6_
- [ ] 20. Add security hardening and validation
- Implement input sanitization and validation middleware
- Add rate limiting to all API endpoints
- Create file security scanning integration
- Implement CORS and security headers
- Add audit logging for sensitive operations
- Write security tests and penetration testing scenarios
- _Requirements: 9.4, 9.5, 9.6_
- [ ] 21. Create comprehensive test suite
- Write integration tests for complete user workflows
- Create end-to-end tests using Playwright
- Implement performance tests for file processing
- Add load testing for concurrent uploads
- Create test data fixtures and mock services
- Set up continuous integration test pipeline
- _Requirements: All requirements need comprehensive testing_
- [ ] 22. Build deployment configuration and documentation
- Create Docker containers for frontend and backend
- Set up database migration and seeding scripts
- Create environment-specific configuration files
- Build deployment scripts and CI/CD pipeline
- Write API documentation and user guides
- Create monitoring and health check endpoints
- _Requirements: System deployment supports all requirements_

BPCP CIM REVIEW TEMPLATE.md

@@ -0,0 +1,146 @@
---
**(A) Deal Overview**
- **Purpose:** Capture essential identifying and tracking information for the deal. Allows for quick filtering, context setting, and internal discussion initiation.
- **Worksheet Fields:**
- `Target Company Name:`  
- `Industry/Sector:`  
- `Geography (HQ & Key Operations):`  
- `Deal Source:` - _Provides context on process competitiveness._  
- `Transaction Type:` - _Frames the strategic rationale and diligence focus._  
- `Date CIM Received:` [Enter Date]
- `Date Reviewed:` [Enter Date]
- `Reviewer(s):` [Enter Name(s)]
- `CIM Page Count:` [Enter Number] _(Indicates level of detail provided)_
- `Stated Reason for Sale (if provided):`  
---
**(B) Business Description**
- **Purpose:** Quickly understand the company's core activities, primary offerings, value proposition, and key customer/supplier dynamics. This forms the foundation for subsequent market and competitive assessments.
- **Worksheet Fields:**
- `Core Operations Summary (3-5 sentences):`  
- `Key Products/Services & Revenue Mix (Est. % if available):`  
- `Unique Value Proposition (UVP) / Why Customers Buy:` - _Crucial for assessing defensibility and pricing power._  
- `Customer Base Overview:`
- `Key Customer Segments/Types:`  
- `Customer Concentration Risk (Top 5 and/or Top 10 Customers as % Revenue - if stated/inferable):` - _A critical risk factor in middle-market deals._  
- `Typical Contract Length / Recurring Revenue % (if applicable):`  
- `Key Supplier Overview (if critical & mentioned):`
- `Dependence/Concentration Risk:`  
---
**(C) Market & Industry Analysis**
- **Purpose:** Evaluate the attractiveness of the company's operating environment, including market size, growth trajectory, key trends, competitive intensity, and the company's relative position.
- **Worksheet Fields:**
- `Estimated Market Size (TAM/SAM - if provided):`  
- `Estimated Market Growth Rate (% CAGR - Historical & Projected):` - _Relative growth is a key diagnostic indicator._  
- `Key Industry Trends & Drivers (Tailwinds/Headwinds):` - _Understanding **why** the market is growing/shrinking is crucial for sustainability assessment._  
- `Competitive Landscape:`
- `Key Competitors Identified:`  
- `Target's Stated Market Position/Rank:`  
- `Basis of Competition:`  
- `Barriers to Entry / Competitive Moat (Stated/Inferred):` - _Assess the sustainability of the company's position._  
---
**(D) Financial Summary**
- **Purpose:** Provide a concise overview of historical financial performance, profitability trends, margins, capital structure, and cash flow characteristics. This section is pivotal for the initial go/no-go decision based on scale, profitability, and preliminary assessment of LBO feasibility.
- **Worksheet Fields:**
- **Table: Key Historical Financials**
- _This table facilitates rapid assessment of financial scale, profitability trends, and margin stability over recent years, addressing core PE screening needs. It distills key quantitative data from potentially lengthy CIM financials for trend analysis and comparison. Analyzing trends over 3+ years is vital, and comparing Revenue growth to EBITDA growth highlights operating leverage or margin pressures._  
|Metric|FY-3 (or earliest avail.)|FY-2|FY-1|LTM (Last Twelve Months)|
|---|---|---|---|---|
|Revenue|[Enter Number]|[Enter Number]|[Enter Number]|[Enter Number]|
|_Revenue Growth (%)_|_N/A_|[Enter %]|[Enter %]|[Enter %]|
|Gross Profit (if avail.)|[Enter Number]|[Enter Number]|[Enter Number]|[Enter Number]|
|_Gross Margin (%)_|[Enter %]|[Enter %]|[Enter %]|[Enter %]|
|EBITDA (Note Adjustments)|[Enter Number]|[Enter Number]|[Enter Number]|[Enter Number]|
|_EBITDA Margin (%)_|[Enter %]|[Enter %]|[Enter %]|[Enter %]|
- `Key Financial Notes & Observations:`
    - `Quality of Earnings/Adjustments (Initial Impression):` _[11]_
    - `Revenue Growth Drivers (Stated):` _[25, 13]_
    - `Margin Stability/Trend Analysis:` _[4, 20, 15]_
    - `Capital Expenditures (Approx. LTM % of Revenue):` [Enter % - Note if Maintenance vs. Growth CapEx breakdown is provided. How does it compare to D&A?] _[4, 15, 17]_
    - `Working Capital Intensity (Impression):` _[4, 15, 17]_
    - `Free Cash Flow (FCF) Proxy Quality (Impression):` _[4, 20, 31]_ - _Assessing the interplay between growth, margins, CapEx, and WC is critical to understanding true cash generation for LBOs, which may differ from the CIM's narrative._
---
**(E) Management Team Overview**
- **Purpose:** Form an initial assessment of the leadership team's quality, depth, relevant experience, and potential continuity post-transaction. Management capability is often a decisive factor in middle-market investments.  
- **Worksheet Fields:**
- `Key Leaders Identified (CEO, CFO, COO, Head of Sales, etc.):`  
- `Initial Assessment of Quality/Experience (Based on Bios):` - _Critically evaluate CIM bios, looking for tangible achievements and potential gaps the PE firm might need to fill._  
- `Management's Stated Post-Transaction Role/Intentions (if mentioned):` - _Key indicator of alignment and potential key person risk._  
- `Organizational Structure Overview (Impression):`  
---
**(F) Preliminary Investment Thesis**
- **Purpose:** Synthesize the review findings into an initial, concise articulation of the potential investment's merits, key risks, and the specific ways the PE firm could create value. This step forces a clear articulation of the rationale for proceeding or passing.
- **Worksheet Fields:**
- `Key Attractions / Strengths (Why Invest?):`  
- `Potential Risks / Concerns (Why Not Invest?):`  
- `Initial Value Creation Levers (How PE Adds Value):` _- assess if the general PE playbook and types of businesses attractive to PE align with the key risks and opportunities identified._  
- `Alignment with Fund Strategy:` _(BPCP is focused on companies in the $5MM+ EBITDA range in consumer and industrial end markets. M&A, increased technology & data usage, and supply chain and human capital optimization are key value levers. There is also a preference for companies that are founder/family-owned and within driving distance of Cleveland and Charlotte.)_
---
**(G) Key Questions & Next Steps**
- **Purpose:** Document critical outstanding questions prompted by the CIM review and define the immediate next actions required in the deal evaluation process. This ensures accountability and directs subsequent diligence efforts.
- **Worksheet Fields:**
- `Critical Questions Arising from CIM Review:` - _Specific questions indicate a deeper review than generic ones._  
- `Key Missing Information / Areas for Diligence Focus:`  
- `Preliminary Recommendation:`
- `Rationale for Recommendation (Brief):`
- `Proposed Next Steps:`

53
backend/.env.example Normal file
View File

@@ -0,0 +1,53 @@
# Backend Environment Variables
# Server Configuration
PORT=5000
NODE_ENV=development
FRONTEND_URL=http://localhost:3000
# Database Configuration
DATABASE_URL=postgresql://username:password@localhost:5432/cim_processor
DB_HOST=localhost
DB_PORT=5432
DB_NAME=cim_processor
DB_USER=username
DB_PASSWORD=password
# Redis Configuration
REDIS_URL=redis://localhost:6379
REDIS_HOST=localhost
REDIS_PORT=6379
# JWT Configuration
JWT_SECRET=your-super-secret-jwt-key-change-this-in-production
JWT_EXPIRES_IN=1h
JWT_REFRESH_SECRET=your-super-secret-refresh-key-change-this-in-production
JWT_REFRESH_EXPIRES_IN=7d
# File Upload Configuration
MAX_FILE_SIZE=104857600
UPLOAD_DIR=uploads
ALLOWED_FILE_TYPES=application/pdf
# LLM Configuration
LLM_PROVIDER=openai
OPENAI_API_KEY=your-openai-api-key
ANTHROPIC_API_KEY=your-anthropic-api-key
LLM_MODEL=gpt-4
LLM_MAX_TOKENS=4000
LLM_TEMPERATURE=0.1
# Storage Configuration
STORAGE_TYPE=local
AWS_ACCESS_KEY_ID=your-aws-access-key
AWS_SECRET_ACCESS_KEY=your-aws-secret-key
AWS_REGION=us-east-1
AWS_S3_BUCKET=cim-processor-files
# Security Configuration
BCRYPT_ROUNDS=12
RATE_LIMIT_WINDOW_MS=900000
RATE_LIMIT_MAX_REQUESTS=100
# Logging Configuration
LOG_LEVEL=info
LOG_FILE=logs/app.log

9
backend/.env.test Normal file
View File

@@ -0,0 +1,9 @@
NODE_ENV=test
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=dummy_key
DATABASE_URL=postgresql://test:test@localhost:5432/test_db
DB_NAME=test_db
DB_USER=test
DB_PASSWORD=test
JWT_SECRET=test_jwt_secret
JWT_REFRESH_SECRET=test_jwt_refresh_secret

224
backend/DATABASE.md Normal file
View File

@@ -0,0 +1,224 @@
# Database Setup and Management
This document describes the database setup, migrations, and management for the CIM Document Processor backend.
## Database Schema
The application uses PostgreSQL with the following tables:
### Users Table
- `id` (UUID, Primary Key)
- `email` (VARCHAR, Unique)
- `name` (VARCHAR)
- `password_hash` (VARCHAR)
- `role` (VARCHAR, 'user' or 'admin')
- `created_at` (TIMESTAMP)
- `updated_at` (TIMESTAMP)
- `last_login` (TIMESTAMP, nullable)
- `is_active` (BOOLEAN)
### Documents Table
- `id` (UUID, Primary Key)
- `user_id` (UUID, Foreign Key to users.id)
- `original_file_name` (VARCHAR)
- `file_path` (VARCHAR)
- `file_size` (BIGINT)
- `uploaded_at` (TIMESTAMP)
- `status` (VARCHAR, processing status)
- `extracted_text` (TEXT, nullable)
- `generated_summary` (TEXT, nullable)
- `summary_markdown_path` (VARCHAR, nullable)
- `summary_pdf_path` (VARCHAR, nullable)
- `processing_started_at` (TIMESTAMP, nullable)
- `processing_completed_at` (TIMESTAMP, nullable)
- `error_message` (TEXT, nullable)
- `created_at` (TIMESTAMP)
- `updated_at` (TIMESTAMP)
### Document Feedback Table
- `id` (UUID, Primary Key)
- `document_id` (UUID, Foreign Key to documents.id)
- `user_id` (UUID, Foreign Key to users.id)
- `feedback` (TEXT)
- `regeneration_instructions` (TEXT, nullable)
- `created_at` (TIMESTAMP)
### Document Versions Table
- `id` (UUID, Primary Key)
- `document_id` (UUID, Foreign Key to documents.id)
- `version_number` (INTEGER)
- `summary_markdown` (TEXT)
- `summary_pdf_path` (VARCHAR)
- `feedback` (TEXT, nullable)
- `created_at` (TIMESTAMP)
### Processing Jobs Table
- `id` (UUID, Primary Key)
- `document_id` (UUID, Foreign Key to documents.id)
- `type` (VARCHAR, job type)
- `status` (VARCHAR, job status)
- `progress` (INTEGER, 0-100)
- `error_message` (TEXT, nullable)
- `created_at` (TIMESTAMP)
- `started_at` (TIMESTAMP, nullable)
- `completed_at` (TIMESTAMP, nullable)
## Setup Instructions
### 1. Install Dependencies
```bash
npm install
```
### 2. Configure Environment Variables
Copy the example environment file and configure your database settings:
```bash
cp .env.example .env
```
Update the following variables in `.env`:
- `DATABASE_URL` - PostgreSQL connection string
- `DB_HOST`, `DB_PORT`, `DB_NAME`, `DB_USER`, `DB_PASSWORD` - Database credentials
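`DATABASE_URL` and the discrete `DB_*` variables encode the same connection. As an illustration only (not part of the backend), a connection string can be split into those parts with Node's built-in `URL` parser; the values below are the placeholders from `.env.example`:

```typescript
// Illustration: split a postgres:// connection string into the discrete
// DB_* values, using Node's built-in WHATWG URL parser.
function parseDatabaseUrl(databaseUrl: string) {
  const url = new URL(databaseUrl);
  return {
    host: url.hostname,
    port: Number(url.port) || 5432,
    name: url.pathname.replace(/^\//, ''),
    user: decodeURIComponent(url.username),
    password: decodeURIComponent(url.password),
  };
}

// Placeholder connection string from .env.example
const parts = parseDatabaseUrl('postgresql://username:password@localhost:5432/cim_processor');
console.log(parts.host, parts.port, parts.name); // localhost 5432 cim_processor
```

Either form works; if both are set, the backend's `config.database` exposes the discrete values to the connection pool.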
### 3. Create Database
Create a PostgreSQL database:
```sql
CREATE DATABASE cim_processor;
```
### 4. Run Migrations and Seed Data
```bash
npm run db:setup
```
This command will:
- Run all database migrations to create tables
- Seed the database with initial test data
## Available Scripts
### Database Management
- `npm run db:migrate` - Run database migrations
- `npm run db:seed` - Seed database with test data
- `npm run db:setup` - Run migrations and seed data
### Development
- `npm run dev` - Start development server
- `npm run build` - Build for production
- `npm run test` - Run tests
- `npm run lint` - Run linting
## Database Models
The application includes the following models:
### UserModel
- `create(userData)` - Create new user
- `findById(id)` - Find user by ID
- `findByEmail(email)` - Find user by email
- `findAll(limit, offset)` - Get all users (admin)
- `update(id, updates)` - Update user
- `delete(id)` - Soft delete user
- `emailExists(email)` - Check if email exists
- `count()` - Count total users
### DocumentModel
- `create(documentData)` - Create new document
- `findById(id)` - Find document by ID
- `findByUserId(userId, limit, offset)` - Get user's documents
- `findAll(limit, offset)` - Get all documents (admin)
- `updateStatus(id, status)` - Update document status
- `updateExtractedText(id, text)` - Update extracted text
- `updateGeneratedSummary(id, summary, markdownPath, pdfPath)` - Update summary
- `delete(id)` - Delete document
- `countByUser(userId)` - Count user's documents
- `findByStatus(status, limit, offset)` - Get documents by status
### DocumentFeedbackModel
- `create(feedbackData)` - Create new feedback
- `findByDocumentId(documentId)` - Get document feedback
- `findByUserId(userId, limit, offset)` - Get user's feedback
- `update(id, updates)` - Update feedback
- `delete(id)` - Delete feedback
### DocumentVersionModel
- `create(versionData)` - Create new version
- `findByDocumentId(documentId)` - Get document versions
- `findLatestByDocumentId(documentId)` - Get latest version
- `getNextVersionNumber(documentId)` - Get next version number
- `update(id, updates)` - Update version
- `delete(id)` - Delete version
### ProcessingJobModel
- `create(jobData)` - Create new job
- `findByDocumentId(documentId)` - Get document jobs
- `findByType(type, limit, offset)` - Get jobs by type
- `findByStatus(status, limit, offset)` - Get jobs by status
- `findPendingJobs(limit)` - Get pending jobs
- `updateStatus(id, status)` - Update job status
- `updateProgress(id, progress)` - Update job progress
- `delete(id)` - Delete job
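The real models are thin wrappers over PostgreSQL queries and live in `backend/src/models/`. As a rough illustration of the call pattern only, here is an in-memory stand-in that mirrors a few of the `UserModel` methods listed above (the async signatures are assumptions for demonstration, not the actual implementation):

```typescript
// Illustration only: an in-memory stand-in mirroring part of the UserModel
// surface. The real model issues SQL against the users table.
interface User {
  id: string;
  email: string;
  name: string;
  is_active: boolean;
}

class InMemoryUserModel {
  private users = new Map<string, User>();
  private nextId = 1;

  async create(userData: { email: string; name: string }): Promise<User> {
    const user: User = { id: String(this.nextId++), is_active: true, ...userData };
    this.users.set(user.id, user);
    return user;
  }

  async findByEmail(email: string): Promise<User | null> {
    for (const user of this.users.values()) {
      if (user.email === email) return user;
    }
    return null;
  }

  async emailExists(email: string): Promise<boolean> {
    return (await this.findByEmail(email)) !== null;
  }

  // Soft delete: flag the row inactive rather than removing it,
  // matching the "Soft delete user" behavior documented above.
  async delete(id: string): Promise<void> {
    const user = this.users.get(id);
    if (user) user.is_active = false;
  }

  async count(): Promise<number> {
    return this.users.size;
  }
}

async function demo() {
  const model = new InMemoryUserModel();
  const u = await model.create({ email: 'a@example.com', name: 'Alice' });
  console.log(await model.emailExists('a@example.com')); // true
  await model.delete(u.id);
  console.log((await model.findByEmail('a@example.com'))?.is_active); // false
}
demo();
```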
## Seeded Data
The database is seeded with the following test data:
### Users
- `admin@example.com` / `admin123` (Admin role)
- `user1@example.com` / `user123` (User role)
- `user2@example.com` / `user123` (User role)
### Sample Documents
- Sample CIM documents with different processing statuses
- Associated processing jobs for testing
## Indexes
The following indexes are created for optimal performance:
### Users Table
- `idx_users_email` - Email lookups
- `idx_users_role` - Role-based queries
- `idx_users_is_active` - Active user filtering
### Documents Table
- `idx_documents_user_id` - User document queries
- `idx_documents_status` - Status-based queries
- `idx_documents_uploaded_at` - Date-based queries
- `idx_documents_user_status` - Composite index for user + status
### Other Tables
- Foreign key indexes on all relationship columns
- Composite indexes for common query patterns
## Triggers
- `update_users_updated_at` - Automatically updates `updated_at` timestamp on user updates
- `update_documents_updated_at` - Automatically updates `updated_at` timestamp on document updates
## Backup and Recovery
### Backup
```bash
pg_dump -h localhost -U username -d cim_processor > backup.sql
```
### Restore
```bash
psql -h localhost -U username -d cim_processor < backup.sql
```
## Troubleshooting
### Common Issues
1. **Connection refused**: Check database credentials and ensure PostgreSQL is running
2. **Permission denied**: Ensure database user has proper permissions
3. **Migration errors**: Check if migrations table exists and is accessible
4. **Seed data errors**: Ensure all required tables exist before seeding
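For the "connection refused" case, it can also help to retry the initial connection while PostgreSQL finishes starting (e.g. under Docker Compose). A generic retry sketch, not part of the backend:

```typescript
// Illustration: retry an async operation with exponential backoff.
// Useful at startup when the database is not yet accepting connections.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 5,
  baseDelayMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff: 100ms, 200ms, 400ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}
```

Something like `await withRetry(() => pool.query('SELECT 1'))` at startup would exercise it, assuming the pool exported from `backend/src/config`.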
### Logs
Check the application logs for detailed error information:
- Database connection errors
- Migration execution logs
- Seed data creation logs

18
backend/jest.config.js Normal file
View File

@@ -0,0 +1,18 @@
module.exports = {
preset: 'ts-jest',
testEnvironment: 'node',
roots: ['<rootDir>/src'],
testMatch: ['**/__tests__/**/*.ts', '**/?(*.)+(spec|test).ts'],
transform: {
'^.+\\.ts$': 'ts-jest',
},
collectCoverageFrom: [
'src/**/*.ts',
'!src/**/*.d.ts',
'!src/index.ts',
],
moduleNameMapper: {
'^@/(.*)$': '<rootDir>/src/$1',
},
setupFilesAfterEnv: ['<rootDir>/src/test/setup.ts'],
};

8778
backend/package-lock.json generated Normal file

File diff suppressed because it is too large Load Diff

58
backend/package.json Normal file
View File

@@ -0,0 +1,58 @@
{
"name": "cim-processor-backend",
"version": "1.0.0",
"description": "Backend API for CIM Document Processor",
"main": "dist/index.js",
"scripts": {
"dev": "ts-node-dev --respawn --transpile-only src/index.ts",
"build": "tsc",
"start": "node dist/index.js",
"test": "jest --passWithNoTests",
"test:watch": "jest --watch --passWithNoTests",
"lint": "eslint src --ext .ts",
"lint:fix": "eslint src --ext .ts --fix",
"db:migrate": "ts-node src/scripts/setup-database.ts",
"db:seed": "ts-node src/models/seed.ts",
"db:setup": "npm run db:migrate"
},
"dependencies": {
"express": "^4.18.2",
"cors": "^2.8.5",
"helmet": "^7.1.0",
"morgan": "^1.10.0",
"dotenv": "^16.3.1",
"bcryptjs": "^2.4.3",
"jsonwebtoken": "^9.0.2",
"multer": "^1.4.5-lts.1",
"pg": "^8.11.3",
"redis": "^4.6.10",
"bull": "^4.12.0",
"pdf-parse": "^1.1.1",
"puppeteer": "^21.5.2",
"winston": "^3.11.0",
"joi": "^17.11.0",
"express-rate-limit": "^7.1.5",
"express-validator": "^7.0.1"
},
"devDependencies": {
"@types/express": "^4.17.21",
"@types/cors": "^2.8.17",
"@types/morgan": "^1.9.9",
"@types/bcryptjs": "^2.4.6",
"@types/jsonwebtoken": "^9.0.5",
"@types/multer": "^1.4.11",
"@types/pg": "^8.10.7",
"@types/pdf-parse": "^1.1.4",
"@types/node": "^20.9.0",
"@types/jest": "^29.5.8",
"@typescript-eslint/eslint-plugin": "^6.10.0",
"@typescript-eslint/parser": "^6.10.0",
"eslint": "^8.53.0",
"jest": "^29.7.0",
"ts-jest": "^29.1.1",
"ts-node-dev": "^2.0.0",
"typescript": "^5.2.2",
"supertest": "^6.3.3",
"@types/supertest": "^2.0.16"
}
}

View File

@@ -0,0 +1,34 @@
import { Pool, PoolClient } from 'pg';
import { config } from './env';
import logger from '../utils/logger';
// Create connection pool
const pool = new Pool({
host: config.database.host,
port: config.database.port,
database: config.database.name,
user: config.database.user,
password: config.database.password,
max: 20, // Maximum number of clients in the pool
idleTimeoutMillis: 30000, // Close idle clients after 30 seconds
connectionTimeoutMillis: 2000, // Return an error after 2 seconds if connection could not be established
});
// Log successful connections from the pool
pool.on('connect', (_client: PoolClient) => {
logger.info('Connected to PostgreSQL database');
});
pool.on('error', (err: Error, _client: PoolClient) => {
logger.error('Unexpected error on idle client', err);
process.exit(-1);
});
// Graceful shutdown
process.on('SIGINT', async () => {
logger.info('Shutting down database pool...');
await pool.end();
process.exit(0);
});
export default pool;

160
backend/src/config/env.ts Normal file
View File

@@ -0,0 +1,160 @@
import dotenv from 'dotenv';
import Joi from 'joi';
// Load environment variables
dotenv.config();
// Environment validation schema
const envSchema = Joi.object({
NODE_ENV: Joi.string().valid('development', 'production', 'test').default('development'),
PORT: Joi.number().default(5000),
// Database
DATABASE_URL: Joi.string().required(),
DB_HOST: Joi.string().default('localhost'),
DB_PORT: Joi.number().default(5432),
DB_NAME: Joi.string().required(),
DB_USER: Joi.string().required(),
DB_PASSWORD: Joi.string().required(),
// Redis
REDIS_URL: Joi.string().default('redis://localhost:6379'),
REDIS_HOST: Joi.string().default('localhost'),
REDIS_PORT: Joi.number().default(6379),
// JWT
JWT_SECRET: Joi.string().required(),
JWT_EXPIRES_IN: Joi.string().default('1h'),
JWT_REFRESH_SECRET: Joi.string().required(),
JWT_REFRESH_EXPIRES_IN: Joi.string().default('7d'),
// File Upload
MAX_FILE_SIZE: Joi.number().default(104857600), // 100MB
UPLOAD_DIR: Joi.string().default('uploads'),
ALLOWED_FILE_TYPES: Joi.string().default('application/pdf'),
// LLM
LLM_PROVIDER: Joi.string().valid('openai', 'anthropic').default('openai'),
OPENAI_API_KEY: Joi.string().when('LLM_PROVIDER', {
is: 'openai',
then: Joi.required(),
otherwise: Joi.optional()
}),
ANTHROPIC_API_KEY: Joi.string().when('LLM_PROVIDER', {
is: 'anthropic',
then: Joi.required(),
otherwise: Joi.optional()
}),
LLM_MODEL: Joi.string().default('gpt-4'),
LLM_MAX_TOKENS: Joi.number().default(4000),
LLM_TEMPERATURE: Joi.number().min(0).max(2).default(0.1),
// Storage
STORAGE_TYPE: Joi.string().valid('local', 's3').default('local'),
AWS_ACCESS_KEY_ID: Joi.string().when('STORAGE_TYPE', {
is: 's3',
then: Joi.required(),
otherwise: Joi.optional()
}),
AWS_SECRET_ACCESS_KEY: Joi.string().when('STORAGE_TYPE', {
is: 's3',
then: Joi.required(),
otherwise: Joi.optional()
}),
AWS_REGION: Joi.string().when('STORAGE_TYPE', {
is: 's3',
then: Joi.required(),
otherwise: Joi.optional()
}),
AWS_S3_BUCKET: Joi.string().when('STORAGE_TYPE', {
is: 's3',
then: Joi.required(),
otherwise: Joi.optional()
}),
// Security
BCRYPT_ROUNDS: Joi.number().default(12),
RATE_LIMIT_WINDOW_MS: Joi.number().default(900000), // 15 minutes
RATE_LIMIT_MAX_REQUESTS: Joi.number().default(100),
// Logging
LOG_LEVEL: Joi.string().valid('error', 'warn', 'info', 'debug').default('info'),
LOG_FILE: Joi.string().default('logs/app.log'),
}).unknown();
// Validate environment variables
const { error, value: envVars } = envSchema.validate(process.env);
if (error) {
throw new Error(`Config validation error: ${error.message}`);
}
// Export validated configuration
export const config = {
env: envVars.NODE_ENV,
nodeEnv: envVars.NODE_ENV,
port: envVars.PORT,
frontendUrl: process.env['FRONTEND_URL'] || 'http://localhost:3000',
database: {
url: envVars.DATABASE_URL,
host: envVars.DB_HOST,
port: envVars.DB_PORT,
name: envVars.DB_NAME,
user: envVars.DB_USER,
password: envVars.DB_PASSWORD,
},
redis: {
url: envVars.REDIS_URL,
host: envVars.REDIS_HOST,
port: envVars.REDIS_PORT,
},
jwt: {
secret: envVars.JWT_SECRET,
expiresIn: envVars.JWT_EXPIRES_IN,
refreshSecret: envVars.JWT_REFRESH_SECRET,
refreshExpiresIn: envVars.JWT_REFRESH_EXPIRES_IN,
},
upload: {
maxFileSize: envVars.MAX_FILE_SIZE,
uploadDir: envVars.UPLOAD_DIR,
allowedFileTypes: envVars.ALLOWED_FILE_TYPES.split(','),
},
llm: {
provider: envVars.LLM_PROVIDER,
openaiApiKey: envVars.OPENAI_API_KEY,
anthropicApiKey: envVars.ANTHROPIC_API_KEY,
model: envVars.LLM_MODEL,
maxTokens: envVars.LLM_MAX_TOKENS,
temperature: envVars.LLM_TEMPERATURE,
},
storage: {
type: envVars.STORAGE_TYPE,
aws: {
accessKeyId: envVars.AWS_ACCESS_KEY_ID,
secretAccessKey: envVars.AWS_SECRET_ACCESS_KEY,
region: envVars.AWS_REGION,
bucket: envVars.AWS_S3_BUCKET,
},
},
security: {
bcryptRounds: envVars.BCRYPT_ROUNDS,
rateLimit: {
windowMs: envVars.RATE_LIMIT_WINDOW_MS,
maxRequests: envVars.RATE_LIMIT_MAX_REQUESTS,
},
},
logging: {
level: envVars.LOG_LEVEL,
file: envVars.LOG_FILE,
},
};
export default config;

View File

@@ -0,0 +1,593 @@
// Mock dependencies - these must be at the top level
jest.mock('../../models/UserModel');
jest.mock('../../services/sessionService');
jest.mock('../../utils/auth', () => ({
generateAuthTokens: jest.fn(),
verifyRefreshToken: jest.fn(),
hashPassword: jest.fn(),
comparePassword: jest.fn(),
validatePassword: jest.fn()
}));
jest.mock('../../utils/logger', () => ({
info: jest.fn(),
error: jest.fn()
}));
import { Response } from 'express';
import {
register,
login,
logout,
refreshToken,
getProfile,
updateProfile
} from '../authController';
import { UserModel } from '../../models/UserModel';
import { sessionService } from '../../services/sessionService';
import { AuthenticatedRequest } from '../../middleware/auth';
// Import mocked modules
const mockUserModel = UserModel as jest.Mocked<typeof UserModel>;
const mockSessionService = sessionService as jest.Mocked<typeof sessionService>;
const mockAuthUtils = jest.requireMock('../../utils/auth');
describe('Auth Controller', () => {
let mockRequest: Partial<AuthenticatedRequest>;
let mockResponse: Partial<Response>;
beforeEach(() => {
mockRequest = {
body: {},
headers: {}
};
mockResponse = {
status: jest.fn().mockReturnThis(),
json: jest.fn().mockReturnThis()
};
// Reset all mocks
jest.clearAllMocks();
// Setup default mock implementations
mockUserModel.findByEmail.mockResolvedValue(null);
mockUserModel.create.mockResolvedValue({} as any);
mockUserModel.findById.mockResolvedValue({} as any);
mockUserModel.updateLastLogin.mockResolvedValue();
mockAuthUtils.hashPassword.mockResolvedValue('hashed-password');
mockAuthUtils.generateAuthTokens.mockReturnValue({
accessToken: 'access-token',
refreshToken: 'refresh-token',
expiresIn: 3600
});
mockAuthUtils.validatePassword.mockReturnValue({
isValid: true,
errors: []
});
mockSessionService.storeSession.mockResolvedValue();
mockSessionService.removeSession.mockResolvedValue();
mockSessionService.getSession.mockResolvedValue(null);
});
describe('register', () => {
const validUserData = {
email: 'test@example.com',
name: 'Test User',
password: 'StrongPass123!'
};
it('should register a new user successfully', async () => {
mockRequest.body = validUserData;
const mockUser = {
id: 'user-123',
email: validUserData.email,
name: validUserData.name,
role: 'user'
};
const mockTokens = {
accessToken: 'access-token',
refreshToken: 'refresh-token',
expiresIn: 3600
};
mockUserModel.findByEmail.mockResolvedValue(null);
mockUserModel.create.mockResolvedValue(mockUser as any);
mockAuthUtils.hashPassword.mockResolvedValue('hashed-password');
mockAuthUtils.generateAuthTokens.mockReturnValue(mockTokens);
mockSessionService.storeSession.mockResolvedValue();
await register(mockRequest as any, mockResponse as any);
expect(mockUserModel.findByEmail).toHaveBeenCalledWith(validUserData.email);
expect(mockUserModel.create).toHaveBeenCalledWith({
email: validUserData.email,
name: validUserData.name,
password: 'hashed-password',
role: 'user'
});
expect(mockAuthUtils.generateAuthTokens).toHaveBeenCalledWith({
userId: mockUser.id,
email: mockUser.email,
role: mockUser.role
});
expect(mockSessionService.storeSession).toHaveBeenCalled();
expect(mockResponse.status).toHaveBeenCalledWith(201);
expect(mockResponse.json).toHaveBeenCalledWith({
success: true,
message: 'User registered successfully',
data: {
user: {
id: mockUser.id,
email: mockUser.email,
name: mockUser.name,
role: mockUser.role
},
tokens: mockTokens
}
});
});
it('should return error for missing required fields', async () => {
mockRequest.body = { email: 'test@example.com' };
await register(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(400);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Email, name, and password are required'
});
});
it('should return error for invalid email format', async () => {
mockRequest.body = {
...validUserData,
email: 'invalid-email'
};
await register(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(400);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Invalid email format'
});
});
it('should return error for weak password', async () => {
mockRequest.body = {
...validUserData,
password: 'weak'
};
// Override the default mock to return validation error
mockAuthUtils.validatePassword.mockReturnValue({
isValid: false,
errors: ['Password must be at least 8 characters long']
});
await register(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(400);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Password does not meet requirements',
errors: expect.arrayContaining([
'Password must be at least 8 characters long'
])
});
});
it('should return error for existing user', async () => {
mockRequest.body = validUserData;
const existingUser = { id: 'existing-user' };
mockUserModel.findByEmail.mockResolvedValue(existingUser as any);
await register(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(409);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'User with this email already exists'
});
});
});
describe('login', () => {
const validLoginData = {
email: 'test@example.com',
password: 'StrongPass123!'
};
it('should login user successfully', async () => {
mockRequest.body = validLoginData;
const mockUser = {
id: 'user-123',
email: validLoginData.email,
name: 'Test User',
role: 'user',
is_active: true,
password_hash: 'hashed-password'
};
const mockTokens = {
accessToken: 'access-token',
refreshToken: 'refresh-token',
expiresIn: 3600
};
mockUserModel.findByEmail.mockResolvedValue(mockUser as any);
mockUserModel.updateLastLogin.mockResolvedValue();
mockAuthUtils.generateAuthTokens.mockReturnValue(mockTokens);
mockSessionService.storeSession.mockResolvedValue();
// Mock comparePassword to return true
mockAuthUtils.comparePassword.mockResolvedValue(true);
await login(mockRequest as any, mockResponse as any);
expect(mockUserModel.findByEmail).toHaveBeenCalledWith(validLoginData.email);
expect(mockAuthUtils.generateAuthTokens).toHaveBeenCalledWith({
userId: mockUser.id,
email: mockUser.email,
role: mockUser.role
});
expect(mockSessionService.storeSession).toHaveBeenCalled();
expect(mockUserModel.updateLastLogin).toHaveBeenCalledWith(mockUser.id);
expect(mockResponse.status).toHaveBeenCalledWith(200);
expect(mockResponse.json).toHaveBeenCalledWith({
success: true,
message: 'Login successful',
data: {
user: {
id: mockUser.id,
email: mockUser.email,
name: mockUser.name,
role: mockUser.role
},
tokens: mockTokens
}
});
});
it('should return error for missing credentials', async () => {
mockRequest.body = { email: 'test@example.com' };
await login(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(400);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Email and password are required'
});
});
it('should return error for non-existent user', async () => {
mockRequest.body = validLoginData;
mockUserModel.findByEmail.mockResolvedValue(null);
await login(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(401);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Invalid email or password'
});
});
it('should return error for inactive user', async () => {
mockRequest.body = validLoginData;
const mockUser = {
id: 'user-123',
email: validLoginData.email,
is_active: false
};
mockUserModel.findByEmail.mockResolvedValue(mockUser as any);
await login(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(401);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Account is deactivated'
});
});
it('should return error for incorrect password', async () => {
mockRequest.body = validLoginData;
const mockUser = {
id: 'user-123',
email: validLoginData.email,
is_active: true,
password_hash: 'hashed-password'
};
mockUserModel.findByEmail.mockResolvedValue(mockUser as any);
// Mock comparePassword to return false (incorrect password)
mockAuthUtils.comparePassword.mockResolvedValue(false);
await login(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(401);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Invalid email or password'
});
});
});
describe('logout', () => {
it('should logout user successfully', async () => {
mockRequest.user = {
userId: 'user-123',
email: 'test@example.com',
role: 'user'
};
mockRequest.headers = {
authorization: 'Bearer access-token'
};
mockSessionService.removeSession.mockResolvedValue();
mockUserModel.updateLastLogin.mockResolvedValue();
await logout(mockRequest as any, mockResponse as any);
expect(mockSessionService.removeSession).toHaveBeenCalledWith('user-123');
expect(mockResponse.status).toHaveBeenCalledWith(200);
expect(mockResponse.json).toHaveBeenCalledWith({
success: true,
message: 'Logout successful'
});
});
it('should return error when user is not authenticated', async () => {
await logout(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(401);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Authentication required'
});
});
});
describe('refreshToken', () => {
it('should refresh token successfully', async () => {
mockRequest.body = { refreshToken: 'valid-refresh-token' };
const mockUser = {
id: 'user-123',
email: 'test@example.com',
role: 'user',
is_active: true
};
const mockSession = {
userId: 'user-123',
refreshToken: 'valid-refresh-token'
};
const mockTokens = {
accessToken: 'new-access-token',
refreshToken: 'new-refresh-token',
expiresIn: 3600
};
mockUserModel.findById.mockResolvedValue(mockUser as any);
mockSessionService.getSession.mockResolvedValue(mockSession as any);
mockAuthUtils.generateAuthTokens.mockReturnValue(mockTokens);
mockSessionService.storeSession.mockResolvedValue();
mockSessionService.blacklistToken.mockResolvedValue();
// Mock verifyRefreshToken to return decoded token
mockAuthUtils.verifyRefreshToken.mockReturnValue({
userId: 'user-123',
email: 'test@example.com',
role: 'user'
});
await refreshToken(mockRequest as any, mockResponse as any);
expect(mockUserModel.findById).toHaveBeenCalledWith('user-123');
expect(mockSessionService.getSession).toHaveBeenCalledWith('user-123');
expect(mockAuthUtils.generateAuthTokens).toHaveBeenCalled();
expect(mockSessionService.storeSession).toHaveBeenCalled();
expect(mockSessionService.blacklistToken).toHaveBeenCalledWith('valid-refresh-token', 86400);
expect(mockResponse.status).toHaveBeenCalledWith(200);
expect(mockResponse.json).toHaveBeenCalledWith({
success: true,
message: 'Token refreshed successfully',
data: {
tokens: mockTokens
}
});
});
it('should return error for missing refresh token', async () => {
mockRequest.body = {};
await refreshToken(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(400);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Refresh token is required'
});
});
});
describe('getProfile', () => {
it('should return user profile successfully', async () => {
mockRequest.user = {
userId: 'user-123',
email: 'test@example.com',
role: 'user'
};
const mockUser = {
id: 'user-123',
email: 'test@example.com',
name: 'Test User',
role: 'user',
created_at: new Date(),
last_login: new Date()
};
mockUserModel.findById.mockResolvedValue(mockUser as any);
await getProfile(mockRequest as any, mockResponse as any);
expect(mockUserModel.findById).toHaveBeenCalledWith('user-123');
expect(mockResponse.status).toHaveBeenCalledWith(200);
expect(mockResponse.json).toHaveBeenCalledWith({
success: true,
data: {
user: {
id: mockUser.id,
email: mockUser.email,
name: mockUser.name,
role: mockUser.role,
created_at: mockUser.created_at,
last_login: mockUser.last_login
}
}
});
});
it('should return error when user is not authenticated', async () => {
await getProfile(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(401);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Authentication required'
});
});
it('should return error when user not found', async () => {
mockRequest.user = {
userId: 'user-123',
email: 'test@example.com',
role: 'user'
};
mockUserModel.findById.mockResolvedValue(null);
await getProfile(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(404);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'User not found'
});
});
});
describe('updateProfile', () => {
it('should update user profile successfully', async () => {
mockRequest.user = {
userId: 'user-123',
email: 'test@example.com',
role: 'user'
};
mockRequest.body = {
name: 'Updated Name',
email: 'updated@example.com'
};
const mockUpdatedUser = {
id: 'user-123',
email: 'updated@example.com',
name: 'Updated Name',
role: 'user',
created_at: new Date(),
last_login: new Date()
};
mockUserModel.findByEmail.mockResolvedValue(null);
mockUserModel.update.mockResolvedValue(mockUpdatedUser as any);
await updateProfile(mockRequest as any, mockResponse as any);
expect(mockUserModel.findByEmail).toHaveBeenCalledWith('updated@example.com');
expect(mockUserModel.update).toHaveBeenCalledWith('user-123', {
name: 'Updated Name',
email: 'updated@example.com'
});
expect(mockResponse.status).toHaveBeenCalledWith(200);
expect(mockResponse.json).toHaveBeenCalledWith({
success: true,
message: 'Profile updated successfully',
data: {
user: {
id: mockUpdatedUser.id,
email: mockUpdatedUser.email,
name: mockUpdatedUser.name,
role: mockUpdatedUser.role,
created_at: mockUpdatedUser.created_at,
last_login: mockUpdatedUser.last_login
}
}
});
});
it('should return error when user is not authenticated', async () => {
await updateProfile(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(401);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Authentication required'
});
});
it('should return error for invalid email format', async () => {
mockRequest.user = {
userId: 'user-123',
email: 'test@example.com',
role: 'user'
};
mockRequest.body = {
email: 'invalid-email'
};
await updateProfile(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(400);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Invalid email format'
});
});
it('should return error for email already taken', async () => {
mockRequest.user = {
userId: 'user-123',
email: 'test@example.com',
role: 'user'
};
mockRequest.body = {
email: 'taken@example.com'
};
const existingUser = { id: 'other-user' };
mockUserModel.findByEmail.mockResolvedValue(existingUser as any);
await updateProfile(mockRequest as any, mockResponse as any);
expect(mockResponse.status).toHaveBeenCalledWith(409);
expect(mockResponse.json).toHaveBeenCalledWith({
success: false,
message: 'Email is already taken'
});
});
});
});


@@ -0,0 +1,464 @@
import { Request, Response } from 'express';
import { AuthenticatedRequest } from '../middleware/auth';
import { UserModel } from '../models/UserModel';
import {
generateAuthTokens,
verifyRefreshToken,
hashPassword,
comparePassword,
validatePassword
} from '../utils/auth';
import { sessionService } from '../services/sessionService';
import logger from '../utils/logger';
export interface RegisterRequest extends Request {
body: {
email: string;
name: string;
password: string;
};
}
export interface LoginRequest extends Request {
body: {
email: string;
password: string;
};
}
export interface RefreshTokenRequest extends Request {
body: {
refreshToken: string;
};
}
/**
* Register a new user
*/
export async function register(req: RegisterRequest, res: Response): Promise<void> {
try {
const { email, name, password } = req.body;
// Validate input
if (!email || !name || !password) {
res.status(400).json({
success: false,
message: 'Email, name, and password are required'
});
return;
}
// Validate email format
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (!emailRegex.test(email)) {
res.status(400).json({
success: false,
message: 'Invalid email format'
});
return;
}
// Validate password strength
const passwordValidation = validatePassword(password);
if (!passwordValidation.isValid) {
res.status(400).json({
success: false,
message: 'Password does not meet requirements',
errors: passwordValidation.errors
});
return;
}
// Check if user already exists
const existingUser = await UserModel.findByEmail(email);
if (existingUser) {
res.status(409).json({
success: false,
message: 'User with this email already exists'
});
return;
}
// Hash password
const hashedPassword = await hashPassword(password);
// Create user
const user = await UserModel.create({
email,
name,
password: hashedPassword,
role: 'user'
});
// Generate tokens
const tokens = generateAuthTokens({
userId: user.id,
email: user.email,
role: user.role
});
// Store session
await sessionService.storeSession(user.id, {
userId: user.id,
email: user.email,
role: user.role,
refreshToken: tokens.refreshToken
});
logger.info(`New user registered: ${email}`);
res.status(201).json({
success: true,
message: 'User registered successfully',
data: {
user: {
id: user.id,
email: user.email,
name: user.name,
role: user.role
},
tokens: {
accessToken: tokens.accessToken,
refreshToken: tokens.refreshToken,
expiresIn: tokens.expiresIn
}
}
});
} catch (error) {
logger.error('Registration error:', error);
res.status(500).json({
success: false,
message: 'Internal server error during registration'
});
}
}
/**
* Login user
*/
export async function login(req: LoginRequest, res: Response): Promise<void> {
try {
const { email, password } = req.body;
// Validate input
if (!email || !password) {
res.status(400).json({
success: false,
message: 'Email and password are required'
});
return;
}
// Find user by email
const user = await UserModel.findByEmail(email);
if (!user) {
res.status(401).json({
success: false,
message: 'Invalid email or password'
});
return;
}
// Check if user is active
if (!user.is_active) {
res.status(401).json({
success: false,
message: 'Account is deactivated'
});
return;
}
// Verify password
const isPasswordValid = await comparePassword(password, user.password_hash);
if (!isPasswordValid) {
res.status(401).json({
success: false,
message: 'Invalid email or password'
});
return;
}
// Generate tokens
const tokens = generateAuthTokens({
userId: user.id,
email: user.email,
role: user.role
});
// Store session
await sessionService.storeSession(user.id, {
userId: user.id,
email: user.email,
role: user.role,
refreshToken: tokens.refreshToken
});
// Update last login
await UserModel.updateLastLogin(user.id);
logger.info(`User logged in: ${email}`);
res.status(200).json({
success: true,
message: 'Login successful',
data: {
user: {
id: user.id,
email: user.email,
name: user.name,
role: user.role
},
tokens: {
accessToken: tokens.accessToken,
refreshToken: tokens.refreshToken,
expiresIn: tokens.expiresIn
}
}
});
} catch (error) {
logger.error('Login error:', error);
res.status(500).json({
success: false,
message: 'Internal server error during login'
});
}
}
/**
* Logout user
*/
export async function logout(req: AuthenticatedRequest, res: Response): Promise<void> {
try {
if (!req.user) {
res.status(401).json({
success: false,
message: 'Authentication required'
});
return;
}
// Get the token from header for blacklisting
const authHeader = req.headers.authorization;
if (authHeader) {
const token = authHeader.split(' ')[1];
if (token) {
// Blacklist the access token
await sessionService.blacklistToken(token, 3600); // 1 hour
}
}
// Remove session
await sessionService.removeSession(req.user.userId);
logger.info(`User logged out: ${req.user.email}`);
res.status(200).json({
success: true,
message: 'Logout successful'
});
} catch (error) {
logger.error('Logout error:', error);
res.status(500).json({
success: false,
message: 'Internal server error during logout'
});
}
}
/**
* Refresh access token
*/
export async function refreshToken(req: RefreshTokenRequest, res: Response): Promise<void> {
try {
const { refreshToken } = req.body;
if (!refreshToken) {
res.status(400).json({
success: false,
message: 'Refresh token is required'
});
return;
}
// Verify refresh token
const decoded = verifyRefreshToken(refreshToken);
// Check if user exists and is active
const user = await UserModel.findById(decoded.userId);
if (!user || !user.is_active) {
res.status(401).json({
success: false,
message: 'Invalid refresh token'
});
return;
}
// Check if session exists and matches
const session = await sessionService.getSession(decoded.userId);
if (!session || session.refreshToken !== refreshToken) {
res.status(401).json({
success: false,
message: 'Invalid refresh token'
});
return;
}
// Generate new tokens
const tokens = generateAuthTokens({
userId: user.id,
email: user.email,
role: user.role
});
// Update session with new refresh token
await sessionService.storeSession(user.id, {
userId: user.id,
email: user.email,
role: user.role,
refreshToken: tokens.refreshToken
});
// Blacklist old refresh token
await sessionService.blacklistToken(refreshToken, 86400); // 24 hours
logger.info(`Token refreshed for user: ${user.email}`);
res.status(200).json({
success: true,
message: 'Token refreshed successfully',
data: {
tokens: {
accessToken: tokens.accessToken,
refreshToken: tokens.refreshToken,
expiresIn: tokens.expiresIn
}
}
});
} catch (error) {
logger.error('Token refresh error:', error);
res.status(401).json({
success: false,
message: 'Invalid refresh token'
});
}
}
/**
* Get current user profile
*/
export async function getProfile(req: AuthenticatedRequest, res: Response): Promise<void> {
try {
if (!req.user) {
res.status(401).json({
success: false,
message: 'Authentication required'
});
return;
}
const user = await UserModel.findById(req.user.userId);
if (!user) {
res.status(404).json({
success: false,
message: 'User not found'
});
return;
}
res.status(200).json({
success: true,
data: {
user: {
id: user.id,
email: user.email,
name: user.name,
role: user.role,
created_at: user.created_at,
last_login: user.last_login
}
}
});
} catch (error) {
logger.error('Get profile error:', error);
res.status(500).json({
success: false,
message: 'Internal server error'
});
}
}
/**
* Update user profile
*/
export async function updateProfile(req: AuthenticatedRequest, res: Response): Promise<void> {
try {
if (!req.user) {
res.status(401).json({
success: false,
message: 'Authentication required'
});
return;
}
const { name, email } = req.body;
// Validate input
if (email) {
const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
if (!emailRegex.test(email)) {
res.status(400).json({
success: false,
message: 'Invalid email format'
});
return;
}
// Check if email is already taken by another user
const existingUser = await UserModel.findByEmail(email);
if (existingUser && existingUser.id !== req.user.userId) {
res.status(409).json({
success: false,
message: 'Email is already taken'
});
return;
}
}
// Update user
const updatedUser = await UserModel.update(req.user.userId, {
name: name || undefined,
email: email || undefined
});
if (!updatedUser) {
res.status(404).json({
success: false,
message: 'User not found'
});
return;
}
logger.info(`Profile updated for user: ${req.user.email}`);
res.status(200).json({
success: true,
message: 'Profile updated successfully',
data: {
user: {
id: updatedUser.id,
email: updatedUser.email,
name: updatedUser.name,
role: updatedUser.role,
created_at: updatedUser.created_at,
last_login: updatedUser.last_login
}
}
});
} catch (error) {
logger.error('Update profile error:', error);
res.status(500).json({
success: false,
message: 'Internal server error'
});
}
}

backend/src/index.ts

@@ -0,0 +1,118 @@
import express from 'express';
import cors from 'cors';
import helmet from 'helmet';
import morgan from 'morgan';
import rateLimit from 'express-rate-limit';
import { config } from './config/env';
import { logger } from './utils/logger';
import authRoutes from './routes/auth';
import documentRoutes from './routes/documents';
import { errorHandler } from './middleware/errorHandler';
import { notFoundHandler } from './middleware/notFoundHandler';
const app = express();
const PORT = config.port || 5000;
// Security middleware
app.use(helmet({
contentSecurityPolicy: {
directives: {
defaultSrc: ["'self'"],
styleSrc: ["'self'", "'unsafe-inline'"],
scriptSrc: ["'self'"],
imgSrc: ["'self'", "data:", "https:"],
},
},
}));
// CORS configuration
app.use(cors({
origin: config.frontendUrl || 'http://localhost:3000',
credentials: true,
methods: ['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'],
allowedHeaders: ['Content-Type', 'Authorization'],
}));
// Rate limiting
const limiter = rateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // limit each IP to 100 requests per windowMs
message: {
error: 'Too many requests from this IP, please try again later.',
},
standardHeaders: true,
legacyHeaders: false,
});
app.use(limiter);
// Body parsing middleware
app.use(express.json({ limit: '10mb' }));
app.use(express.urlencoded({ extended: true, limit: '10mb' }));
// Logging middleware
app.use(morgan('combined', {
stream: {
write: (message: string) => logger.info(message.trim()),
},
}));
// Health check endpoint
app.get('/health', (_req, res) => {
res.status(200).json({
status: 'ok',
timestamp: new Date().toISOString(),
uptime: process.uptime(),
environment: config.nodeEnv,
});
});
// API routes
app.use('/api/auth', authRoutes);
app.use('/api/documents', documentRoutes);
// API root endpoint
app.get('/api', (_req, res) => {
res.json({
message: 'CIM Document Processor API',
version: '1.0.0',
endpoints: {
auth: '/api/auth',
documents: '/api/documents',
health: '/health',
},
});
});
// 404 handler
app.use(notFoundHandler);
// Global error handler (must be last)
app.use(errorHandler);
// Start server
const server = app.listen(PORT, () => {
logger.info(`🚀 Server running on port ${PORT}`);
logger.info(`📊 Environment: ${config.nodeEnv}`);
logger.info(`🔗 API URL: http://localhost:${PORT}/api`);
logger.info(`🏥 Health check: http://localhost:${PORT}/health`);
});
// Graceful shutdown
process.on('SIGTERM', () => {
logger.info('SIGTERM received, shutting down gracefully');
server.close(() => {
logger.info('Process terminated');
process.exit(0);
});
});
process.on('SIGINT', () => {
logger.info('SIGINT received, shutting down gracefully');
server.close(() => {
logger.info('Process terminated');
process.exit(0);
});
});
export default app;


@@ -0,0 +1,244 @@
import { Request, Response, NextFunction } from 'express';
import { verifyAccessToken, extractTokenFromHeader } from '../utils/auth';
import { sessionService } from '../services/sessionService';
import { UserModel } from '../models/UserModel';
import logger from '../utils/logger';
export interface AuthenticatedRequest extends Request {
user?: {
userId: string;
email: string;
role: string;
};
}
/**
* Authentication middleware to verify JWT tokens
*/
export async function authenticateToken(
req: AuthenticatedRequest,
res: Response,
next: NextFunction
): Promise<void> {
try {
const authHeader = req.headers.authorization;
const token = extractTokenFromHeader(authHeader);
if (!token) {
res.status(401).json({
success: false,
message: 'Access token is required'
});
return;
}
// Check if token is blacklisted
const isBlacklisted = await sessionService.isTokenBlacklisted(token);
if (isBlacklisted) {
res.status(401).json({
success: false,
message: 'Token has been revoked'
});
return;
}
// Verify the token
const decoded = verifyAccessToken(token);
// Check if user still exists and is active
const user = await UserModel.findById(decoded.userId);
if (!user || !user.is_active) {
res.status(401).json({
success: false,
message: 'User account is inactive or does not exist'
});
return;
}
// Check if session exists
const session = await sessionService.getSession(decoded.userId);
if (!session) {
res.status(401).json({
success: false,
message: 'Session expired, please login again'
});
return;
}
// Attach user info to request
req.user = {
userId: decoded.userId,
email: decoded.email,
role: decoded.role
};
logger.info(`Authenticated request for user: ${decoded.email}`);
next();
} catch (error) {
logger.error('Authentication error:', error);
res.status(401).json({
success: false,
message: 'Invalid or expired token'
});
}
}
// Alias for backward compatibility
export const auth = authenticateToken;
/**
* Role-based authorization middleware
*/
export function requireRole(allowedRoles: string[]) {
return (req: AuthenticatedRequest, res: Response, next: NextFunction): void => {
if (!req.user) {
res.status(401).json({
success: false,
message: 'Authentication required'
});
return;
}
if (!allowedRoles.includes(req.user.role)) {
res.status(403).json({
success: false,
message: 'Insufficient permissions'
});
return;
}
logger.info(`Authorized request for user: ${req.user.email} with role: ${req.user.role}`);
next();
};
}
/**
* Admin-only middleware
*/
export function requireAdmin(
req: AuthenticatedRequest,
res: Response,
next: NextFunction
): void {
requireRole(['admin'])(req, res, next);
}
/**
* User or admin middleware
*/
export function requireUserOrAdmin(
req: AuthenticatedRequest,
res: Response,
next: NextFunction
): void {
requireRole(['user', 'admin'])(req, res, next);
}
/**
* Optional authentication middleware (doesn't fail if no token)
*/
export async function optionalAuth(
req: AuthenticatedRequest,
_res: Response,
next: NextFunction
): Promise<void> {
try {
const authHeader = req.headers.authorization;
const token = extractTokenFromHeader(authHeader);
if (!token) {
// No token provided, continue without authentication
next();
return;
}
// Check if token is blacklisted
const isBlacklisted = await sessionService.isTokenBlacklisted(token);
if (isBlacklisted) {
// Token is blacklisted, continue without authentication
next();
return;
}
// Verify the token
const decoded = verifyAccessToken(token);
// Check if user still exists and is active
const user = await UserModel.findById(decoded.userId);
if (!user || !user.is_active) {
// User doesn't exist or is inactive, continue without authentication
next();
return;
}
// Check if session exists
const session = await sessionService.getSession(decoded.userId);
if (!session) {
// Session doesn't exist, continue without authentication
next();
return;
}
// Attach user info to request
req.user = {
userId: decoded.userId,
email: decoded.email,
role: decoded.role
};
logger.info(`Optional authentication successful for user: ${decoded.email}`);
next();
} catch (error) {
// Token verification failed, continue without authentication
logger.debug('Optional authentication failed, continuing without user context');
next();
}
}
/**
* Rate limiting middleware for authentication endpoints
*/
export function authRateLimit(
_req: Request,
_res: Response,
next: NextFunction
): void {
// This would typically integrate with a rate limiting library
// For now, we'll just pass through
// TODO: Implement proper rate limiting
next();
}
/**
* Logout middleware to invalidate session
*/
export async function logout(
req: AuthenticatedRequest,
res: Response,
next: NextFunction
): Promise<void> {
try {
if (!req.user) {
res.status(401).json({
success: false,
message: 'Authentication required'
});
return;
}
// Remove session
await sessionService.removeSession(req.user.userId);
// Update last login in database
await UserModel.updateLastLogin(req.user.userId);
logger.info(`User logged out: ${req.user.email}`);
next();
} catch (error) {
logger.error('Logout error:', error);
res.status(500).json({
success: false,
message: 'Error during logout'
});
}
}


@@ -0,0 +1,66 @@
import { Request, Response, NextFunction } from 'express';
import { logger } from '../utils/logger';
export interface AppError extends Error {
statusCode?: number;
isOperational?: boolean;
}
export const errorHandler = (
err: AppError,
req: Request,
res: Response,
_next: NextFunction
): void => {
let error = { ...err };
// Spreading an Error copies only enumerable own properties, so
// non-enumerable fields like message and stack are lost in the copy;
// message is restored explicitly here.
error.message = err.message;
// Log error
logger.error('Error occurred:', {
error: err.message,
stack: err.stack,
url: req.url,
method: req.method,
ip: req.ip,
userAgent: req.get('User-Agent'),
});
// Mongoose bad ObjectId
if (err.name === 'CastError') {
const message = 'Resource not found';
error = { message, statusCode: 404 } as AppError;
}
// Mongoose duplicate key
if (err.name === 'MongoError' && (err as any).code === 11000) {
const message = 'Duplicate field value entered';
error = { message, statusCode: 400 } as AppError;
}
// Mongoose validation error
if (err.name === 'ValidationError') {
const message = Object.values((err as any).errors).map((val: any) => val.message).join(', ');
error = { message, statusCode: 400 } as AppError;
}
// JWT errors
if (err.name === 'JsonWebTokenError') {
const message = 'Invalid token';
error = { message, statusCode: 401 } as AppError;
}
if (err.name === 'TokenExpiredError') {
const message = 'Token expired';
error = { message, statusCode: 401 } as AppError;
}
// Default error
const statusCode = error.statusCode || 500;
const message = error.message || 'Server Error';
res.status(statusCode).json({
success: false,
error: message,
...(process.env['NODE_ENV'] === 'development' && { stack: err.stack }),
});
};


@@ -0,0 +1,13 @@
import { Request, Response, NextFunction } from 'express';
export const notFoundHandler = (
req: Request,
res: Response,
_next: NextFunction
): void => {
res.status(404).json({
success: false,
error: `Route ${req.originalUrl} not found`,
message: 'The requested resource does not exist',
});
};


@@ -0,0 +1,75 @@
import { Request, Response, NextFunction } from 'express';
import Joi from 'joi';
// Document upload validation schema
const documentUploadSchema = Joi.object({
title: Joi.string().min(1).max(255).optional(),
description: Joi.string().max(1000).optional(),
});
export const validateDocumentUpload = (
req: Request,
res: Response,
next: NextFunction
): void => {
const { error } = documentUploadSchema.validate(req.body);
if (error) {
res.status(400).json({
success: false,
error: 'Validation failed',
details: error.details.map(detail => detail.message),
});
return;
}
next();
};
// Feedback validation schema
const feedbackSchema = Joi.object({
feedback: Joi.string().min(1).max(2000).required(),
});
export const validateFeedback = (
req: Request,
res: Response,
next: NextFunction
): void => {
const { error } = feedbackSchema.validate(req.body);
if (error) {
res.status(400).json({
success: false,
error: 'Validation failed',
details: error.details.map(detail => detail.message),
});
return;
}
next();
};
// Regeneration validation schema
const regenerationSchema = Joi.object({
feedbackId: Joi.string().required(),
});
export const validateRegeneration = (
req: Request,
res: Response,
next: NextFunction
): void => {
const { error } = regenerationSchema.validate(req.body);
if (error) {
res.status(400).json({
success: false,
error: 'Validation failed',
details: error.details.map(detail => detail.message),
});
return;
}
next();
};


@@ -0,0 +1,196 @@
import pool from '../config/database';
import { DocumentFeedback, CreateDocumentFeedbackInput } from './types';
import logger from '../utils/logger';
export class DocumentFeedbackModel {
/**
* Create new document feedback
*/
static async create(feedbackData: CreateDocumentFeedbackInput): Promise<DocumentFeedback> {
const { document_id, user_id, feedback, regeneration_instructions } = feedbackData;
const query = `
INSERT INTO document_feedback (document_id, user_id, feedback, regeneration_instructions)
VALUES ($1, $2, $3, $4)
RETURNING *
`;
try {
const result = await pool.query(query, [document_id, user_id, feedback, regeneration_instructions]);
logger.info(`Created feedback for document: ${document_id} by user: ${user_id}`);
return result.rows[0];
} catch (error) {
logger.error('Error creating document feedback:', error);
throw error;
}
}
/**
* Find feedback by ID
*/
static async findById(id: string): Promise<DocumentFeedback | null> {
const query = 'SELECT * FROM document_feedback WHERE id = $1';
try {
const result = await pool.query(query, [id]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding feedback by ID:', error);
throw error;
}
}
/**
* Get feedback by document ID
*/
static async findByDocumentId(documentId: string): Promise<DocumentFeedback[]> {
const query = `
SELECT df.*, u.name as user_name, u.email as user_email
FROM document_feedback df
JOIN users u ON df.user_id = u.id
WHERE df.document_id = $1
ORDER BY df.created_at DESC
`;
try {
const result = await pool.query(query, [documentId]);
return result.rows;
} catch (error) {
logger.error('Error finding feedback by document ID:', error);
throw error;
}
}
/**
* Get feedback by user ID
*/
static async findByUserId(userId: string, limit = 50, offset = 0): Promise<DocumentFeedback[]> {
const query = `
SELECT df.*, d.original_file_name
FROM document_feedback df
JOIN documents d ON df.document_id = d.id
WHERE df.user_id = $1
ORDER BY df.created_at DESC
LIMIT $2 OFFSET $3
`;
try {
const result = await pool.query(query, [userId, limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding feedback by user ID:', error);
throw error;
}
}
/**
* Get all feedback (for admin)
*/
static async findAll(limit = 100, offset = 0): Promise<(DocumentFeedback & { user_name: string, user_email: string, original_file_name: string })[]> {
const query = `
SELECT df.*, u.name as user_name, u.email as user_email, d.original_file_name
FROM document_feedback df
JOIN users u ON df.user_id = u.id
JOIN documents d ON df.document_id = d.id
ORDER BY df.created_at DESC
LIMIT $1 OFFSET $2
`;
try {
const result = await pool.query(query, [limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding all feedback:', error);
throw error;
}
}
/**
* Update feedback
*/
static async update(id: string, updates: Partial<DocumentFeedback>): Promise<DocumentFeedback | null> {
const allowedFields = ['feedback', 'regeneration_instructions'];
const updateFields: string[] = [];
const values: any[] = [];
let paramCount = 1;
// Build dynamic update query
for (const [key, value] of Object.entries(updates)) {
if (allowedFields.includes(key) && value !== undefined) {
updateFields.push(`${key} = $${paramCount}`);
values.push(value);
paramCount++;
}
}
if (updateFields.length === 0) {
return this.findById(id);
}
values.push(id);
const query = `
UPDATE document_feedback
SET ${updateFields.join(', ')}
WHERE id = $${paramCount}
RETURNING *
`;
try {
const result = await pool.query(query, values);
logger.info(`Updated feedback: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating feedback:', error);
throw error;
}
}
/**
* Delete feedback
*/
static async delete(id: string): Promise<boolean> {
const query = 'DELETE FROM document_feedback WHERE id = $1 RETURNING id';
try {
const result = await pool.query(query, [id]);
const deleted = result.rows.length > 0;
if (deleted) {
logger.info(`Deleted feedback: ${id}`);
}
return deleted;
} catch (error) {
logger.error('Error deleting feedback:', error);
throw error;
}
}
/**
* Count feedback by document
*/
static async countByDocument(documentId: string): Promise<number> {
const query = 'SELECT COUNT(*) FROM document_feedback WHERE document_id = $1';
try {
const result = await pool.query(query, [documentId]);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting feedback by document:', error);
throw error;
}
}
/**
* Count total feedback
*/
static async count(): Promise<number> {
const query = 'SELECT COUNT(*) FROM document_feedback';
try {
const result = await pool.query(query);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting feedback:', error);
throw error;
}
}
}


@@ -0,0 +1,280 @@
import pool from '../config/database';
import { Document, CreateDocumentInput, ProcessingStatus } from './types';
import logger from '../utils/logger';
export class DocumentModel {
/**
* Create a new document
*/
static async create(documentData: CreateDocumentInput): Promise<Document> {
const { user_id, original_file_name, file_path, file_size } = documentData;
const query = `
INSERT INTO documents (user_id, original_file_name, file_path, file_size)
VALUES ($1, $2, $3, $4)
RETURNING *
`;
try {
const result = await pool.query(query, [user_id, original_file_name, file_path, file_size]);
logger.info(`Created document: ${original_file_name} for user: ${user_id}`);
return result.rows[0];
} catch (error) {
logger.error('Error creating document:', error);
throw error;
}
}
/**
* Find document by ID
*/
static async findById(id: string): Promise<Document | null> {
const query = 'SELECT * FROM documents WHERE id = $1';
try {
const result = await pool.query(query, [id]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding document by ID:', error);
throw error;
}
}
/**
* Find document by ID with user information
*/
static async findByIdWithUser(id: string): Promise<(Document & { user_name: string, user_email: string }) | null> {
const query = `
SELECT d.*, u.name as user_name, u.email as user_email
FROM documents d
JOIN users u ON d.user_id = u.id
WHERE d.id = $1
`;
try {
const result = await pool.query(query, [id]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding document with user:', error);
throw error;
}
}
/**
* Get documents by user ID
*/
static async findByUserId(userId: string, limit = 50, offset = 0): Promise<Document[]> {
const query = `
SELECT * FROM documents
WHERE user_id = $1
ORDER BY created_at DESC
LIMIT $2 OFFSET $3
`;
try {
const result = await pool.query(query, [userId, limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding documents by user ID:', error);
throw error;
}
}
/**
* Get all documents (for admin)
*/
static async findAll(limit = 100, offset = 0): Promise<(Document & { user_name: string, user_email: string })[]> {
const query = `
SELECT d.*, u.name as user_name, u.email as user_email
FROM documents d
JOIN users u ON d.user_id = u.id
ORDER BY d.created_at DESC
LIMIT $1 OFFSET $2
`;
try {
const result = await pool.query(query, [limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding all documents:', error);
throw error;
}
}
/**
* Update document status
*/
static async updateStatus(id: string, status: ProcessingStatus): Promise<Document | null> {
const query = `
UPDATE documents
SET status = $1,
processing_started_at = CASE WHEN $1 IN ('extracting_text', 'processing_llm', 'generating_pdf') THEN COALESCE(processing_started_at, CURRENT_TIMESTAMP) ELSE processing_started_at END,
processing_completed_at = CASE WHEN $1 IN ('completed', 'failed') THEN CURRENT_TIMESTAMP ELSE processing_completed_at END
WHERE id = $2
RETURNING *
`;
try {
const result = await pool.query(query, [status, id]);
logger.info(`Updated document ${id} status to: ${status}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating document status:', error);
throw error;
}
}
/**
* Update document with extracted text
*/
static async updateExtractedText(id: string, extractedText: string): Promise<Document | null> {
const query = `
UPDATE documents
SET extracted_text = $1
WHERE id = $2
RETURNING *
`;
try {
const result = await pool.query(query, [extractedText, id]);
logger.info(`Updated extracted text for document: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating extracted text:', error);
throw error;
}
}
/**
* Update document with generated summary
*/
static async updateGeneratedSummary(id: string, summary: string, markdownPath?: string, pdfPath?: string): Promise<Document | null> {
const query = `
UPDATE documents
SET generated_summary = $1,
summary_markdown_path = $2,
summary_pdf_path = $3
WHERE id = $4
RETURNING *
`;
try {
const result = await pool.query(query, [summary, markdownPath, pdfPath, id]);
logger.info(`Updated generated summary for document: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating generated summary:', error);
throw error;
}
}
/**
* Update document error message
*/
static async updateErrorMessage(id: string, errorMessage: string): Promise<Document | null> {
const query = `
UPDATE documents
SET error_message = $1
WHERE id = $2
RETURNING *
`;
try {
const result = await pool.query(query, [errorMessage, id]);
logger.info(`Updated error message for document: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating error message:', error);
throw error;
}
}
/**
* Delete document
*/
static async delete(id: string): Promise<boolean> {
const query = 'DELETE FROM documents WHERE id = $1 RETURNING id';
try {
const result = await pool.query(query, [id]);
const deleted = result.rows.length > 0;
if (deleted) {
logger.info(`Deleted document: ${id}`);
}
return deleted;
} catch (error) {
logger.error('Error deleting document:', error);
throw error;
}
}
/**
* Count documents by user
*/
static async countByUser(userId: string): Promise<number> {
const query = 'SELECT COUNT(*) FROM documents WHERE user_id = $1';
try {
const result = await pool.query(query, [userId]);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting documents by user:', error);
throw error;
}
}
/**
* Count total documents
*/
static async count(): Promise<number> {
const query = 'SELECT COUNT(*) FROM documents';
try {
const result = await pool.query(query);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting documents:', error);
throw error;
}
}
/**
* Get documents by status
*/
static async findByStatus(status: ProcessingStatus, limit = 50, offset = 0): Promise<Document[]> {
const query = `
SELECT * FROM documents
WHERE status = $1
ORDER BY created_at DESC
LIMIT $2 OFFSET $3
`;
try {
const result = await pool.query(query, [status, limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding documents by status:', error);
throw error;
}
}
/**
* Get documents that need processing
*/
static async findPendingProcessing(limit = 10): Promise<Document[]> {
const query = `
SELECT * FROM documents
WHERE status IN ('uploaded', 'extracting_text', 'processing_llm', 'generating_pdf')
ORDER BY created_at ASC
LIMIT $1
`;
try {
const result = await pool.query(query, [limit]);
return result.rows;
} catch (error) {
logger.error('Error finding pending processing documents:', error);
throw error;
}
}
}
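The CASE expressions in `DocumentModel.updateStatus` encode a small state machine: entering any active state stamps `processing_started_at` exactly once (via `COALESCE`), and reaching a terminal state stamps `processing_completed_at`. The same rules, expressed as plain TypeScript for illustration (a sketch, not part of the committed model):

```typescript
// Status names mirror the ProcessingStatus union used by the model.
type ProcessingStatus =
  | 'uploaded' | 'extracting_text' | 'processing_llm'
  | 'generating_pdf' | 'completed' | 'failed';

interface Timestamps {
  processing_started_at: Date | null;
  processing_completed_at: Date | null;
}

function applyStatusTransition(
  prev: Timestamps,
  status: ProcessingStatus,
  now: Date
): Timestamps {
  const active =
    status === 'extracting_text' ||
    status === 'processing_llm' ||
    status === 'generating_pdf';
  const terminal = status === 'completed' || status === 'failed';
  return {
    // COALESCE(processing_started_at, CURRENT_TIMESTAMP):
    // the start time is only ever written once
    processing_started_at: active
      ? (prev.processing_started_at ?? now)
      : prev.processing_started_at,
    // terminal states always stamp the completion time
    processing_completed_at: terminal ? now : prev.processing_completed_at,
  };
}
```

Keeping this logic in the SQL (rather than reading, deciding, and writing back) makes the transition atomic under concurrent updates.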



@@ -0,0 +1,232 @@
import pool from '../config/database';
import { DocumentVersion, CreateDocumentVersionInput } from './types';
import logger from '../utils/logger';
export class DocumentVersionModel {
/**
* Create new document version
*/
static async create(versionData: CreateDocumentVersionInput): Promise<DocumentVersion> {
const { document_id, version_number, summary_markdown, summary_pdf_path, feedback } = versionData;
const query = `
INSERT INTO document_versions (document_id, version_number, summary_markdown, summary_pdf_path, feedback)
VALUES ($1, $2, $3, $4, $5)
RETURNING *
`;
try {
const result = await pool.query(query, [document_id, version_number, summary_markdown, summary_pdf_path, feedback]);
logger.info(`Created version ${version_number} for document: ${document_id}`);
return result.rows[0];
} catch (error) {
logger.error('Error creating document version:', error);
throw error;
}
}
/**
* Find version by ID
*/
static async findById(id: string): Promise<DocumentVersion | null> {
const query = 'SELECT * FROM document_versions WHERE id = $1';
try {
const result = await pool.query(query, [id]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding version by ID:', error);
throw error;
}
}
/**
* Get versions by document ID
*/
static async findByDocumentId(documentId: string): Promise<DocumentVersion[]> {
const query = `
SELECT * FROM document_versions
WHERE document_id = $1
ORDER BY version_number DESC
`;
try {
const result = await pool.query(query, [documentId]);
return result.rows;
} catch (error) {
logger.error('Error finding versions by document ID:', error);
throw error;
}
}
/**
* Get latest version by document ID
*/
static async findLatestByDocumentId(documentId: string): Promise<DocumentVersion | null> {
const query = `
SELECT * FROM document_versions
WHERE document_id = $1
ORDER BY version_number DESC
LIMIT 1
`;
try {
const result = await pool.query(query, [documentId]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding latest version by document ID:', error);
throw error;
}
}
/**
* Get specific version by document ID and version number
*/
static async findByDocumentIdAndVersion(documentId: string, versionNumber: number): Promise<DocumentVersion | null> {
const query = `
SELECT * FROM document_versions
WHERE document_id = $1 AND version_number = $2
`;
try {
const result = await pool.query(query, [documentId, versionNumber]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding version by document ID and version number:', error);
throw error;
}
}
/**
* Get next version number for a document
*/
static async getNextVersionNumber(documentId: string): Promise<number> {
const query = `
SELECT COALESCE(MAX(version_number), 0) + 1 as next_version
FROM document_versions
WHERE document_id = $1
`;
try {
const result = await pool.query(query, [documentId]);
return parseInt(result.rows[0].next_version);
} catch (error) {
logger.error('Error getting next version number:', error);
throw error;
}
}
/**
* Update version
*/
static async update(id: string, updates: Partial<DocumentVersion>): Promise<DocumentVersion | null> {
const allowedFields = ['summary_markdown', 'summary_pdf_path', 'feedback'];
const updateFields: string[] = [];
const values: any[] = [];
let paramCount = 1;
// Build dynamic update query
for (const [key, value] of Object.entries(updates)) {
if (allowedFields.includes(key) && value !== undefined) {
updateFields.push(`${key} = $${paramCount}`);
values.push(value);
paramCount++;
}
}
if (updateFields.length === 0) {
return this.findById(id);
}
values.push(id);
const query = `
UPDATE document_versions
SET ${updateFields.join(', ')}
WHERE id = $${paramCount}
RETURNING *
`;
try {
const result = await pool.query(query, values);
logger.info(`Updated version: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating version:', error);
throw error;
}
}
/**
* Delete version
*/
static async delete(id: string): Promise<boolean> {
const query = 'DELETE FROM document_versions WHERE id = $1 RETURNING id';
try {
const result = await pool.query(query, [id]);
const deleted = result.rows.length > 0;
if (deleted) {
logger.info(`Deleted version: ${id}`);
}
return deleted;
} catch (error) {
logger.error('Error deleting version:', error);
throw error;
}
}
/**
* Delete all versions for a document
*/
static async deleteByDocumentId(documentId: string): Promise<number> {
const query = 'DELETE FROM document_versions WHERE document_id = $1 RETURNING id';
try {
const result = await pool.query(query, [documentId]);
const deletedCount = result.rows.length;
if (deletedCount > 0) {
logger.info(`Deleted ${deletedCount} versions for document: ${documentId}`);
}
return deletedCount;
} catch (error) {
logger.error('Error deleting versions by document ID:', error);
throw error;
}
}
/**
* Count versions by document
*/
static async countByDocument(documentId: string): Promise<number> {
const query = 'SELECT COUNT(*) FROM document_versions WHERE document_id = $1';
try {
const result = await pool.query(query, [documentId]);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting versions by document:', error);
throw error;
}
}
/**
* Get version history with document info
*/
static async getVersionHistory(documentId: string): Promise<(DocumentVersion & { original_file_name: string })[]> {
const query = `
SELECT dv.*, d.original_file_name
FROM document_versions dv
JOIN documents d ON dv.document_id = d.id
WHERE dv.document_id = $1
ORDER BY dv.version_number DESC
`;
try {
const result = await pool.query(query, [documentId]);
return result.rows;
} catch (error) {
logger.error('Error getting version history:', error);
throw error;
}
}
}
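Both `DocumentVersionModel.update` and `UserModel.update` build their SQL the same way: only whitelisted column names reach the query text, and every value travels through a numbered placeholder. Factored into a standalone helper, the pattern looks like this (an illustrative sketch, not a function the models export):

```typescript
// Build a parameterized UPDATE from a field whitelist. Returns null when
// nothing updatable was supplied, mirroring the models' fall back to findById.
function buildUpdateQuery(
  table: string,
  allowedFields: string[],
  updates: Record<string, unknown>,
  id: string
): { query: string; values: unknown[] } | null {
  const setClauses: string[] = [];
  const values: unknown[] = [];
  let paramCount = 1;
  for (const [key, value] of Object.entries(updates)) {
    // Only whitelisted keys are interpolated into SQL text; values are
    // bound as parameters, so caller input never reaches the query string.
    if (allowedFields.includes(key) && value !== undefined) {
      setClauses.push(`${key} = $${paramCount}`);
      values.push(value);
      paramCount++;
    }
  }
  if (setClauses.length === 0) return null;
  values.push(id);
  return {
    query: `UPDATE ${table} SET ${setClauses.join(', ')} WHERE id = $${paramCount} RETURNING *`,
    values,
  };
}
```

The whitelist is what makes the dynamic interpolation safe: a caller sending `{ role: 'admin' }` to a model whose `allowedFields` omits `role` simply has that key dropped.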


@@ -0,0 +1,305 @@
import pool from '../config/database';
import { ProcessingJob, CreateProcessingJobInput, JobType, JobStatus } from './types';
import logger from '../utils/logger';
export class ProcessingJobModel {
/**
* Create new processing job
*/
static async create(jobData: CreateProcessingJobInput): Promise<ProcessingJob> {
const { document_id, type } = jobData;
const query = `
INSERT INTO processing_jobs (document_id, type, status, progress)
VALUES ($1, $2, 'pending', 0)
RETURNING *
`;
try {
const result = await pool.query(query, [document_id, type]);
logger.info(`Created processing job: ${type} for document: ${document_id}`);
return result.rows[0];
} catch (error) {
logger.error('Error creating processing job:', error);
throw error;
}
}
/**
* Find job by ID
*/
static async findById(id: string): Promise<ProcessingJob | null> {
const query = 'SELECT * FROM processing_jobs WHERE id = $1';
try {
const result = await pool.query(query, [id]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding job by ID:', error);
throw error;
}
}
/**
* Get jobs by document ID
*/
static async findByDocumentId(documentId: string): Promise<ProcessingJob[]> {
const query = `
SELECT * FROM processing_jobs
WHERE document_id = $1
ORDER BY created_at DESC
`;
try {
const result = await pool.query(query, [documentId]);
return result.rows;
} catch (error) {
logger.error('Error finding jobs by document ID:', error);
throw error;
}
}
/**
* Get jobs by type
*/
static async findByType(type: JobType, limit = 50, offset = 0): Promise<ProcessingJob[]> {
const query = `
SELECT * FROM processing_jobs
WHERE type = $1
ORDER BY created_at DESC
LIMIT $2 OFFSET $3
`;
try {
const result = await pool.query(query, [type, limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding jobs by type:', error);
throw error;
}
}
/**
* Get jobs by status
*/
static async findByStatus(status: JobStatus, limit = 50, offset = 0): Promise<ProcessingJob[]> {
const query = `
SELECT * FROM processing_jobs
WHERE status = $1
ORDER BY created_at ASC
LIMIT $2 OFFSET $3
`;
try {
const result = await pool.query(query, [status, limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding jobs by status:', error);
throw error;
}
}
/**
* Get pending jobs (for job queue processing)
*/
static async findPendingJobs(limit = 10): Promise<ProcessingJob[]> {
const query = `
SELECT * FROM processing_jobs
WHERE status = 'pending'
ORDER BY created_at ASC
LIMIT $1
`;
try {
const result = await pool.query(query, [limit]);
return result.rows;
} catch (error) {
logger.error('Error finding pending jobs:', error);
throw error;
}
}
/**
* Get all jobs (for admin)
*/
static async findAll(limit = 100, offset = 0): Promise<(ProcessingJob & { original_file_name: string, user_name: string })[]> {
const query = `
SELECT pj.*, d.original_file_name, u.name as user_name
FROM processing_jobs pj
JOIN documents d ON pj.document_id = d.id
JOIN users u ON d.user_id = u.id
ORDER BY pj.created_at DESC
LIMIT $1 OFFSET $2
`;
try {
const result = await pool.query(query, [limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding all jobs:', error);
throw error;
}
}
/**
* Update job status
*/
static async updateStatus(id: string, status: JobStatus): Promise<ProcessingJob | null> {
const query = `
UPDATE processing_jobs
SET status = $1,
started_at = CASE WHEN $1 = 'processing' THEN COALESCE(started_at, CURRENT_TIMESTAMP) ELSE started_at END,
completed_at = CASE WHEN $1 IN ('completed', 'failed') THEN CURRENT_TIMESTAMP ELSE completed_at END
WHERE id = $2
RETURNING *
`;
try {
const result = await pool.query(query, [status, id]);
logger.info(`Updated job ${id} status to: ${status}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating job status:', error);
throw error;
}
}
/**
* Update job progress
*/
static async updateProgress(id: string, progress: number): Promise<ProcessingJob | null> {
const query = `
UPDATE processing_jobs
SET progress = $1
WHERE id = $2
RETURNING *
`;
try {
const result = await pool.query(query, [progress, id]);
logger.info(`Updated job ${id} progress to: ${progress}%`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating job progress:', error);
throw error;
}
}
/**
* Update job error message
*/
static async updateErrorMessage(id: string, errorMessage: string): Promise<ProcessingJob | null> {
const query = `
UPDATE processing_jobs
SET error_message = $1
WHERE id = $2
RETURNING *
`;
try {
const result = await pool.query(query, [errorMessage, id]);
logger.info(`Updated error message for job: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating job error message:', error);
throw error;
}
}
/**
* Delete job
*/
static async delete(id: string): Promise<boolean> {
const query = 'DELETE FROM processing_jobs WHERE id = $1 RETURNING id';
try {
const result = await pool.query(query, [id]);
const deleted = result.rows.length > 0;
if (deleted) {
logger.info(`Deleted job: ${id}`);
}
return deleted;
} catch (error) {
logger.error('Error deleting job:', error);
throw error;
}
}
/**
* Delete jobs by document ID
*/
static async deleteByDocumentId(documentId: string): Promise<number> {
const query = 'DELETE FROM processing_jobs WHERE document_id = $1 RETURNING id';
try {
const result = await pool.query(query, [documentId]);
const deletedCount = result.rows.length;
if (deletedCount > 0) {
logger.info(`Deleted ${deletedCount} jobs for document: ${documentId}`);
}
return deletedCount;
} catch (error) {
logger.error('Error deleting jobs by document ID:', error);
throw error;
}
}
/**
* Count jobs by status
*/
static async countByStatus(status: JobStatus): Promise<number> {
const query = 'SELECT COUNT(*) FROM processing_jobs WHERE status = $1';
try {
const result = await pool.query(query, [status]);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting jobs by status:', error);
throw error;
}
}
/**
* Count total jobs
*/
static async count(): Promise<number> {
const query = 'SELECT COUNT(*) FROM processing_jobs';
try {
const result = await pool.query(query);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting jobs:', error);
throw error;
}
}
/**
* Get job statistics
*/
static async getJobStatistics(): Promise<{
total: number;
pending: number;
processing: number;
completed: number;
failed: number;
}> {
const query = `
SELECT
COUNT(*) as total,
COUNT(CASE WHEN status = 'pending' THEN 1 END) as pending,
COUNT(CASE WHEN status = 'processing' THEN 1 END) as processing,
COUNT(CASE WHEN status = 'completed' THEN 1 END) as completed,
COUNT(CASE WHEN status = 'failed' THEN 1 END) as failed
FROM processing_jobs
`;
try {
const result = await pool.query(query);
const row = result.rows[0];
// node-postgres returns aggregate counts as strings (bigint), so coerce
// each field to a number to match the declared return type
return {
total: parseInt(row.total),
pending: parseInt(row.pending),
processing: parseInt(row.processing),
completed: parseInt(row.completed),
failed: parseInt(row.failed)
};
} catch (error) {
logger.error('Error getting job statistics:', error);
throw error;
}
}
}
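The conditional aggregation in `getJobStatistics` (`COUNT(CASE WHEN status = '…' THEN 1 END)`) tallies every bucket in a single pass over the table. The equivalent single-pass fold over an in-memory job list, shown as a sketch for illustration:

```typescript
// Mirrors the JobStatus union from the models' types.
type JobStatus = 'pending' | 'processing' | 'completed' | 'failed';

function summarizeJobs(statuses: JobStatus[]) {
  const stats = { total: 0, pending: 0, processing: 0, completed: 0, failed: 0 };
  for (const s of statuses) {
    stats.total++;   // COUNT(*)
    stats[s]++;      // COUNT(CASE WHEN status = s THEN 1 END)
  }
  return stats;
}
```

Doing this in one SQL statement instead of five `countByStatus` calls keeps the numbers mutually consistent: all five counts come from the same snapshot of the table.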


@@ -0,0 +1,181 @@
import pool from '../config/database';
import { User, CreateUserInput } from './types';
import logger from '../utils/logger';
export class UserModel {
/**
* Create a new user.
* Note: the value passed as `password` is stored in the password_hash
* column as-is, so the caller (the auth layer) is expected to supply an
* already-hashed password.
*/
static async create(userData: CreateUserInput): Promise<User> {
const { email, name, password, role = 'user' } = userData;
const query = `
INSERT INTO users (email, name, password_hash, role)
VALUES ($1, $2, $3, $4)
RETURNING *
`;
try {
const result = await pool.query(query, [email, name, password, role]);
logger.info(`Created user: ${email}`);
return result.rows[0];
} catch (error) {
logger.error('Error creating user:', error);
throw error;
}
}
/**
* Find user by ID
*/
static async findById(id: string): Promise<User | null> {
const query = 'SELECT * FROM users WHERE id = $1 AND is_active = true';
try {
const result = await pool.query(query, [id]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding user by ID:', error);
throw error;
}
}
/**
* Find user by email
*/
static async findByEmail(email: string): Promise<User | null> {
const query = 'SELECT * FROM users WHERE email = $1 AND is_active = true';
try {
const result = await pool.query(query, [email]);
return result.rows[0] || null;
} catch (error) {
logger.error('Error finding user by email:', error);
throw error;
}
}
/**
* Get all users (for admin)
*/
static async findAll(limit = 100, offset = 0): Promise<User[]> {
const query = `
SELECT * FROM users
WHERE is_active = true
ORDER BY created_at DESC
LIMIT $1 OFFSET $2
`;
try {
const result = await pool.query(query, [limit, offset]);
return result.rows;
} catch (error) {
logger.error('Error finding all users:', error);
throw error;
}
}
/**
* Update user
*/
static async update(id: string, updates: Partial<User>): Promise<User | null> {
const allowedFields = ['name', 'email', 'role', 'is_active', 'last_login'];
const updateFields: string[] = [];
const values: any[] = [];
let paramCount = 1;
// Build dynamic update query
for (const [key, value] of Object.entries(updates)) {
if (allowedFields.includes(key) && value !== undefined) {
updateFields.push(`${key} = $${paramCount}`);
values.push(value);
paramCount++;
}
}
if (updateFields.length === 0) {
return this.findById(id);
}
values.push(id);
const query = `
UPDATE users
SET ${updateFields.join(', ')}
WHERE id = $${paramCount} AND is_active = true
RETURNING *
`;
try {
const result = await pool.query(query, values);
logger.info(`Updated user: ${id}`);
return result.rows[0] || null;
} catch (error) {
logger.error('Error updating user:', error);
throw error;
}
}
/**
* Update last login timestamp
*/
static async updateLastLogin(id: string): Promise<void> {
const query = 'UPDATE users SET last_login = CURRENT_TIMESTAMP WHERE id = $1';
try {
await pool.query(query, [id]);
logger.info(`Updated last login for user: ${id}`);
} catch (error) {
logger.error('Error updating last login:', error);
throw error;
}
}
/**
* Soft delete user (set is_active to false)
*/
static async delete(id: string): Promise<boolean> {
const query = 'UPDATE users SET is_active = false WHERE id = $1 RETURNING id';
try {
const result = await pool.query(query, [id]);
const deleted = result.rows.length > 0;
if (deleted) {
logger.info(`Soft deleted user: ${id}`);
}
return deleted;
} catch (error) {
logger.error('Error deleting user:', error);
throw error;
}
}
/**
* Count total users
*/
static async count(): Promise<number> {
const query = 'SELECT COUNT(*) FROM users WHERE is_active = true';
try {
const result = await pool.query(query);
return parseInt(result.rows[0].count);
} catch (error) {
logger.error('Error counting users:', error);
throw error;
}
}
/**
* Check if email exists
*/
static async emailExists(email: string): Promise<boolean> {
const query = 'SELECT id FROM users WHERE email = $1 AND is_active = true';
try {
const result = await pool.query(query, [email]);
return result.rows.length > 0;
} catch (error) {
logger.error('Error checking email existence:', error);
throw error;
}
}
}
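Because `UserModel.create` writes the supplied value straight into `password_hash`, hashing has to happen before the model is called. A minimal salted-hash sketch using Node's built-in `crypto.scryptSync` as a stand-in (an assumption for illustration only; the project's auth service may well use bcrypt instead):

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from 'crypto';

// Hash with a random per-user salt; store "salt:hash" in password_hash.
function hashPassword(password: string): string {
  const salt = randomBytes(16).toString('hex');
  const hash = scryptSync(password, salt, 64).toString('hex');
  return `${salt}:${hash}`;
}

// Recompute with the stored salt and compare in constant time.
function verifyPassword(password: string, stored: string): boolean {
  const [salt, hash] = stored.split(':');
  const candidate = scryptSync(password, salt, 64);
  return timingSafeEqual(candidate, Buffer.from(hash, 'hex'));
}
```

Whatever scheme the auth layer uses, keeping it out of the model preserves the model's single responsibility: persistence, not credential handling.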


@@ -0,0 +1,338 @@
import { DocumentModel } from '../DocumentModel';
import { CreateDocumentInput } from '../types';
// Mock the database pool
jest.mock('../../config/database', () => ({
query: jest.fn()
}));
// Mock the logger
jest.mock('../../utils/logger', () => ({
info: jest.fn(),
error: jest.fn(),
warn: jest.fn()
}));
describe('DocumentModel', () => {
let mockPool: any;
beforeEach(() => {
jest.clearAllMocks();
mockPool = require('../../config/database');
});
describe('create', () => {
it('should create a new document successfully', async () => {
const documentData: CreateDocumentInput = {
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000
};
const mockDocument = {
id: '123e4567-e89b-12d3-a456-426614174001',
...documentData,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
};
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] });
const result = await DocumentModel.create(documentData);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('INSERT INTO documents'),
[documentData.user_id, documentData.original_file_name, documentData.file_path, documentData.file_size]
);
expect(result).toEqual(mockDocument);
});
it('should handle database errors', async () => {
const documentData: CreateDocumentInput = {
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000
};
const error = new Error('Database error');
mockPool.query.mockRejectedValueOnce(error);
await expect(DocumentModel.create(documentData)).rejects.toThrow('Database error');
});
});
describe('findById', () => {
it('should find document by ID successfully', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
const mockDocument = {
id: documentId,
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
};
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] });
const result = await DocumentModel.findById(documentId);
expect(mockPool.query).toHaveBeenCalledWith(
'SELECT * FROM documents WHERE id = $1',
[documentId]
);
expect(result).toEqual(mockDocument);
});
it('should return null when document not found', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
mockPool.query.mockResolvedValueOnce({ rows: [] });
const result = await DocumentModel.findById(documentId);
expect(result).toBeNull();
});
});
describe('findByUserId', () => {
it('should find documents by user ID successfully', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
const mockDocuments = [
{
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: userId,
original_file_name: 'test1.pdf',
file_path: '/uploads/test1.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
},
{
id: '123e4567-e89b-12d3-a456-426614174002',
user_id: userId,
original_file_name: 'test2.pdf',
file_path: '/uploads/test2.pdf',
file_size: 2048000,
uploaded_at: new Date(),
status: 'completed',
created_at: new Date(),
updated_at: new Date()
}
];
mockPool.query.mockResolvedValueOnce({ rows: mockDocuments });
const result = await DocumentModel.findByUserId(userId);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('SELECT * FROM documents'),
[userId, 50, 0]
);
expect(result).toEqual(mockDocuments);
});
});
describe('updateStatus', () => {
it('should update document status successfully', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
const newStatus = 'processing_llm';
const mockUpdatedDocument = {
id: documentId,
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: newStatus,
processing_started_at: new Date(),
created_at: new Date(),
updated_at: new Date()
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUpdatedDocument] });
const result = await DocumentModel.updateStatus(documentId, newStatus);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('UPDATE documents'),
[newStatus, documentId]
);
expect(result).toEqual(mockUpdatedDocument);
});
});
describe('updateExtractedText', () => {
it('should update extracted text successfully', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
const extractedText = 'This is the extracted text from the PDF';
const mockUpdatedDocument = {
id: documentId,
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'extracting_text',
extracted_text: extractedText,
created_at: new Date(),
updated_at: new Date()
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUpdatedDocument] });
const result = await DocumentModel.updateExtractedText(documentId, extractedText);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('UPDATE documents'),
[extractedText, documentId]
);
expect(result).toEqual(mockUpdatedDocument);
});
});
describe('updateGeneratedSummary', () => {
it('should update generated summary successfully', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
const summary = 'Generated summary content';
const markdownPath = '/summaries/test.md';
const pdfPath = '/summaries/test.pdf';
const mockUpdatedDocument = {
id: documentId,
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'completed',
generated_summary: summary,
summary_markdown_path: markdownPath,
summary_pdf_path: pdfPath,
created_at: new Date(),
updated_at: new Date()
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUpdatedDocument] });
const result = await DocumentModel.updateGeneratedSummary(documentId, summary, markdownPath, pdfPath);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('UPDATE documents'),
[summary, markdownPath, pdfPath, documentId]
);
expect(result).toEqual(mockUpdatedDocument);
});
});
describe('delete', () => {
it('should delete document successfully', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
mockPool.query.mockResolvedValueOnce({ rows: [{ id: documentId }] });
const result = await DocumentModel.delete(documentId);
expect(mockPool.query).toHaveBeenCalledWith(
'DELETE FROM documents WHERE id = $1 RETURNING id',
[documentId]
);
expect(result).toBe(true);
});
it('should return false when document not found', async () => {
const documentId = '123e4567-e89b-12d3-a456-426614174001';
mockPool.query.mockResolvedValueOnce({ rows: [] });
const result = await DocumentModel.delete(documentId);
expect(result).toBe(false);
});
});
describe('countByUser', () => {
it('should return correct document count for user', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
const expectedCount = 5;
mockPool.query.mockResolvedValueOnce({ rows: [{ count: expectedCount.toString() }] });
const result = await DocumentModel.countByUser(userId);
expect(mockPool.query).toHaveBeenCalledWith(
'SELECT COUNT(*) FROM documents WHERE user_id = $1',
[userId]
);
expect(result).toBe(expectedCount);
});
});
describe('findByStatus', () => {
it('should find documents by status successfully', async () => {
const status = 'completed';
const mockDocuments = [
{
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test1.pdf',
file_path: '/uploads/test1.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status,
created_at: new Date(),
updated_at: new Date()
}
];
mockPool.query.mockResolvedValueOnce({ rows: mockDocuments });
const result = await DocumentModel.findByStatus(status);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('SELECT * FROM documents'),
[status, 50, 0]
);
expect(result).toEqual(mockDocuments);
});
});
describe('findPendingProcessing', () => {
it('should find pending processing documents', async () => {
const mockDocuments = [
{
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
}
];
mockPool.query.mockResolvedValueOnce({ rows: mockDocuments });
const result = await DocumentModel.findPendingProcessing();
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('SELECT * FROM documents'),
[10]
);
expect(result).toEqual(mockDocuments);
});
});
});


@@ -0,0 +1,227 @@
import { UserModel } from '../UserModel';
import { CreateUserInput } from '../types';
// Mock the database pool
jest.mock('../../config/database', () => ({
query: jest.fn()
}));
// Mock the logger
jest.mock('../../utils/logger', () => ({
info: jest.fn(),
error: jest.fn(),
warn: jest.fn()
}));
describe('UserModel', () => {
let mockPool: any;
beforeEach(() => {
jest.clearAllMocks();
mockPool = require('../../config/database');
});
describe('create', () => {
it('should create a new user successfully', async () => {
const userData: CreateUserInput = {
email: 'test@example.com',
name: 'Test User',
password: 'password123',
role: 'user'
};
const mockUser = {
id: '123e4567-e89b-12d3-a456-426614174000',
email: userData.email,
name: userData.name,
password_hash: 'hashed_password',
role: userData.role,
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] });
const result = await UserModel.create(userData);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('INSERT INTO users'),
[userData.email, userData.name, userData.password, userData.role]
);
expect(result).toEqual(mockUser);
});
it('should handle database errors', async () => {
const userData: CreateUserInput = {
email: 'test@example.com',
name: 'Test User',
password: 'password123'
};
const error = new Error('Database error');
mockPool.query.mockRejectedValueOnce(error);
await expect(UserModel.create(userData)).rejects.toThrow('Database error');
});
});
describe('findById', () => {
it('should find user by ID successfully', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
const mockUser = {
id: userId,
email: 'test@example.com',
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] });
const result = await UserModel.findById(userId);
expect(mockPool.query).toHaveBeenCalledWith(
'SELECT * FROM users WHERE id = $1 AND is_active = true',
[userId]
);
expect(result).toEqual(mockUser);
});
it('should return null when user not found', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
mockPool.query.mockResolvedValueOnce({ rows: [] });
const result = await UserModel.findById(userId);
expect(result).toBeNull();
});
});
describe('findByEmail', () => {
it('should find user by email successfully', async () => {
const email = 'test@example.com';
const mockUser = {
id: '123e4567-e89b-12d3-a456-426614174000',
email,
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] });
const result = await UserModel.findByEmail(email);
expect(mockPool.query).toHaveBeenCalledWith(
'SELECT * FROM users WHERE email = $1 AND is_active = true',
[email]
);
expect(result).toEqual(mockUser);
});
});
describe('update', () => {
it('should update user successfully', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
const updates = {
name: 'Updated Name',
email: 'updated@example.com'
};
const mockUpdatedUser = {
id: userId,
...updates,
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
mockPool.query.mockResolvedValueOnce({ rows: [mockUpdatedUser] });
const result = await UserModel.update(userId, updates);
expect(mockPool.query).toHaveBeenCalledWith(
expect.stringContaining('UPDATE users'),
expect.arrayContaining([updates.name, updates.email, userId])
);
expect(result).toEqual(mockUpdatedUser);
});
});
describe('delete', () => {
it('should soft delete user successfully', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
mockPool.query.mockResolvedValueOnce({ rows: [{ id: userId }] });
const result = await UserModel.delete(userId);
expect(mockPool.query).toHaveBeenCalledWith(
'UPDATE users SET is_active = false WHERE id = $1 RETURNING id',
[userId]
);
expect(result).toBe(true);
});
it('should return false when user not found', async () => {
const userId = '123e4567-e89b-12d3-a456-426614174000';
mockPool.query.mockResolvedValueOnce({ rows: [] });
const result = await UserModel.delete(userId);
expect(result).toBe(false);
});
});
describe('emailExists', () => {
it('should return true when email exists', async () => {
const email = 'test@example.com';
mockPool.query.mockResolvedValueOnce({ rows: [{ id: '123' }] });
const result = await UserModel.emailExists(email);
expect(mockPool.query).toHaveBeenCalledWith(
'SELECT id FROM users WHERE email = $1 AND is_active = true',
[email]
);
expect(result).toBe(true);
});
it('should return false when email does not exist', async () => {
const email = 'test@example.com';
mockPool.query.mockResolvedValueOnce({ rows: [] });
const result = await UserModel.emailExists(email);
expect(result).toBe(false);
});
});
describe('count', () => {
it('should return correct user count', async () => {
const expectedCount = 5;
mockPool.query.mockResolvedValueOnce({ rows: [{ count: expectedCount.toString() }] });
const result = await UserModel.count();
expect(mockPool.query).toHaveBeenCalledWith(
'SELECT COUNT(*) FROM users WHERE is_active = true'
);
expect(result).toBe(expectedCount);
});
});
});


@@ -0,0 +1,293 @@
import { UserModel } from '../UserModel';
import { DocumentModel } from '../DocumentModel';
import { DocumentFeedbackModel } from '../DocumentFeedbackModel';
import { DocumentVersionModel } from '../DocumentVersionModel';
import { ProcessingJobModel } from '../ProcessingJobModel';
// Mock the database pool
jest.mock('../../config/database', () => ({
query: jest.fn()
}));
// Mock the logger
jest.mock('../../utils/logger', () => ({
info: jest.fn(),
error: jest.fn(),
warn: jest.fn()
}));
describe('Database Models Integration', () => {
let mockPool: any;
beforeEach(() => {
jest.clearAllMocks();
mockPool = require('../../config/database');
});
describe('User and Document Relationship', () => {
it('should handle user-document relationship correctly', async () => {
const mockUser = {
id: '123e4567-e89b-12d3-a456-426614174000',
email: 'test@example.com',
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
const mockDocument = {
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: mockUser.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
};
// Mock user creation
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] });
// Mock document creation
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] });
// Mock finding documents by user
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] });
// Test the workflow
const user = await UserModel.create({
email: 'test@example.com',
name: 'Test User',
password: 'password123'
});
const document = await DocumentModel.create({
user_id: user.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000
});
const userDocuments = await DocumentModel.findByUserId(user.id);
expect(user.id).toBe(mockUser.id);
expect(document.user_id).toBe(user.id);
expect(userDocuments).toHaveLength(1);
expect(userDocuments[0]?.id).toBe(document.id);
});
});
describe('Document Processing Workflow', () => {
it('should handle complete document processing workflow', async () => {
const mockUser = {
id: '123e4567-e89b-12d3-a456-426614174000',
email: 'test@example.com',
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
const mockDocument = {
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: mockUser.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
};
const mockProcessingJob = {
id: '123e4567-e89b-12d3-a456-426614174002',
document_id: mockDocument.id,
type: 'text_extraction',
status: 'pending',
progress: 0,
created_at: new Date()
};
// Mock the workflow
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] }); // Create user
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] }); // Create document
mockPool.query.mockResolvedValueOnce({ rows: [mockProcessingJob] }); // Create job
mockPool.query.mockResolvedValueOnce({ rows: [{ ...mockDocument, status: 'extracting_text' }] }); // Update status
mockPool.query.mockResolvedValueOnce({ rows: [{ ...mockDocument, extracted_text: 'Extracted text' }] }); // Update text
mockPool.query.mockResolvedValueOnce({ rows: [{ ...mockDocument, status: 'completed' }] }); // Complete
// Execute workflow
const user = await UserModel.create({
email: 'test@example.com',
name: 'Test User',
password: 'password123'
});
const document = await DocumentModel.create({
user_id: user.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000
});
const job = await ProcessingJobModel.create({
document_id: document.id,
type: 'text_extraction'
});
await DocumentModel.updateStatus(document.id, 'extracting_text');
await DocumentModel.updateExtractedText(document.id, 'Extracted text');
await DocumentModel.updateStatus(document.id, 'completed');
expect(job.document_id).toBe(document.id);
expect(job.type).toBe('text_extraction');
});
});
describe('Document Feedback and Versioning', () => {
it('should handle feedback and versioning workflow', async () => {
const mockUser = {
id: '123e4567-e89b-12d3-a456-426614174000',
email: 'test@example.com',
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
const mockDocument = {
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: mockUser.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'completed',
created_at: new Date(),
updated_at: new Date()
};
const mockFeedback = {
id: '123e4567-e89b-12d3-a456-426614174003',
document_id: mockDocument.id,
user_id: mockUser.id,
feedback: 'Please make the summary more concise',
regeneration_instructions: 'Focus on key points only',
created_at: new Date()
};
const mockVersion = {
id: '123e4567-e89b-12d3-a456-426614174004',
document_id: mockDocument.id,
version_number: 2,
summary_markdown: '# Updated Summary\n\nMore concise version',
summary_pdf_path: '/summaries/test_v2.pdf',
feedback: 'Please make the summary more concise',
created_at: new Date()
};
// Mock the workflow
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] }); // Create user
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] }); // Create document
mockPool.query.mockResolvedValueOnce({ rows: [mockFeedback] }); // Create feedback
mockPool.query.mockResolvedValueOnce({ rows: [mockVersion] }); // Create version
// Execute workflow
const user = await UserModel.create({
email: 'test@example.com',
name: 'Test User',
password: 'password123'
});
const document = await DocumentModel.create({
user_id: user.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000
});
const feedback = await DocumentFeedbackModel.create({
document_id: document.id,
user_id: user.id,
feedback: 'Please make the summary more concise',
regeneration_instructions: 'Focus on key points only'
});
const version = await DocumentVersionModel.create({
document_id: document.id,
version_number: 2,
summary_markdown: '# Updated Summary\n\nMore concise version',
summary_pdf_path: '/summaries/test_v2.pdf',
feedback: 'Please make the summary more concise'
});
expect(feedback.document_id).toBe(document.id);
expect(feedback.user_id).toBe(user.id);
expect(version.document_id).toBe(document.id);
expect(version.version_number).toBe(2);
});
});
describe('Model Relationships', () => {
it('should maintain referential integrity', async () => {
const mockUser = {
id: '123e4567-e89b-12d3-a456-426614174000',
email: 'test@example.com',
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true
};
const mockDocument = {
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: mockUser.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date()
};
// Mock queries
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] }); // Create user
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] }); // Create document
mockPool.query.mockResolvedValueOnce({ rows: [mockUser] }); // Find user
mockPool.query.mockResolvedValueOnce({ rows: [mockDocument] }); // Find document
// Test relationships
const user = await UserModel.create({
email: 'test@example.com',
name: 'Test User',
password: 'password123'
});
const document = await DocumentModel.create({
user_id: user.id,
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000
});
const foundUser = await UserModel.findById(user.id);
const foundDocument = await DocumentModel.findById(document.id);
expect(foundUser?.id).toBe(user.id);
expect(foundDocument?.id).toBe(document.id);
expect(foundDocument?.user_id).toBe(user.id);
});
});
});


@@ -0,0 +1,13 @@
// Export all models
export { UserModel } from './UserModel';
export { DocumentModel } from './DocumentModel';
export { DocumentFeedbackModel } from './DocumentFeedbackModel';
export { DocumentVersionModel } from './DocumentVersionModel';
export { ProcessingJobModel } from './ProcessingJobModel';
// Export types
export * from './types';
// Export database utilities
export { default as DatabaseMigrator } from './migrate';
export { default as DatabaseSeeder } from './seed';


@@ -0,0 +1,170 @@
import fs from 'fs';
import path from 'path';
import pool from '../config/database';
import logger from '../utils/logger';
interface Migration {
id: string;
name: string;
sql: string;
}
class DatabaseMigrator {
private migrationsDir: string;
constructor() {
this.migrationsDir = path.join(__dirname, 'migrations');
}
/**
* Get all migration files
*/
private async getMigrationFiles(): Promise<string[]> {
try {
const files = await fs.promises.readdir(this.migrationsDir);
      return files
        .filter(file => file.endsWith('.sql'))
        .sort(); // lexicographic sort; relies on sortable filename prefixes (e.g. 001_, 002_) for execution order
} catch (error) {
logger.error('Error reading migrations directory:', error);
throw error;
}
}
/**
* Load migration content
*/
private async loadMigration(fileName: string): Promise<Migration> {
const filePath = path.join(this.migrationsDir, fileName);
const sql = await fs.promises.readFile(filePath, 'utf-8');
return {
id: fileName.replace('.sql', ''),
name: fileName,
sql
};
}
/**
* Create migrations table if it doesn't exist
*/
private async createMigrationsTable(): Promise<void> {
const query = `
CREATE TABLE IF NOT EXISTS migrations (
id VARCHAR(255) PRIMARY KEY,
name VARCHAR(255) NOT NULL,
executed_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
`;
try {
await pool.query(query);
logger.info('Migrations table created or already exists');
} catch (error) {
logger.error('Error creating migrations table:', error);
throw error;
}
}
/**
* Check if migration has been executed
*/
private async isMigrationExecuted(migrationId: string): Promise<boolean> {
const query = 'SELECT id FROM migrations WHERE id = $1';
try {
const result = await pool.query(query, [migrationId]);
return result.rows.length > 0;
} catch (error) {
logger.error('Error checking migration status:', error);
throw error;
}
}
/**
* Mark migration as executed
*/
private async markMigrationExecuted(migrationId: string, name: string): Promise<void> {
const query = 'INSERT INTO migrations (id, name) VALUES ($1, $2)';
try {
await pool.query(query, [migrationId, name]);
logger.info(`Migration marked as executed: ${name}`);
} catch (error) {
logger.error('Error marking migration as executed:', error);
throw error;
}
}
/**
* Execute a single migration
*/
private async executeMigration(migration: Migration): Promise<void> {
try {
logger.info(`Executing migration: ${migration.name}`);
// Execute the migration SQL
await pool.query(migration.sql);
// Mark as executed
await this.markMigrationExecuted(migration.id, migration.name);
logger.info(`Migration completed: ${migration.name}`);
} catch (error) {
logger.error(`Migration failed: ${migration.name}`, error);
throw error;
}
}
/**
* Run all pending migrations
*/
async migrate(): Promise<void> {
try {
logger.info('Starting database migration...');
// Create migrations table
await this.createMigrationsTable();
// Get all migration files
const migrationFiles = await this.getMigrationFiles();
logger.info(`Found ${migrationFiles.length} migration files`);
// Execute each migration
for (const fileName of migrationFiles) {
const migration = await this.loadMigration(fileName);
// Check if already executed
const isExecuted = await this.isMigrationExecuted(migration.id);
if (!isExecuted) {
await this.executeMigration(migration);
} else {
logger.info(`Migration already executed: ${migration.name}`);
}
}
logger.info('Database migration completed successfully');
} catch (error) {
logger.error('Database migration failed:', error);
throw error;
}
}
/**
* Get migration status
*/
async getMigrationStatus(): Promise<{ id: string; name: string; executed_at: Date }[]> {
const query = 'SELECT id, name, executed_at FROM migrations ORDER BY executed_at';
try {
const result = await pool.query(query);
return result.rows;
} catch (error) {
logger.error('Error getting migration status:', error);
throw error;
}
}
}
export default DatabaseMigrator;
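`DatabaseMigrator` runs the migration SQL and the bookkeeping `INSERT` as two separate statements, so a crash between them would leave a migration applied but unrecorded. A hedged sketch of an atomic variant (the `QueryRunner` interface is hypothetical, standing in for a `pg` client checked out with `pool.connect()`):

```typescript
// Minimal query-runner interface so the sketch is self-contained; in the real
// code this would be a pg PoolClient.
interface QueryRunner {
  query(text: string, params?: unknown[]): Promise<unknown>;
}

// Run a migration and its migrations-table row in one transaction, so either
// both take effect or neither does.
async function runMigrationAtomically(
  db: QueryRunner,
  id: string,
  name: string,
  sql: string
): Promise<void> {
  await db.query('BEGIN');
  try {
    await db.query(sql);
    await db.query('INSERT INTO migrations (id, name) VALUES ($1, $2)', [id, name]);
    await db.query('COMMIT');
  } catch (error) {
    await db.query('ROLLBACK'); // undo a half-applied migration
    throw error;
  }
}
```

Note that a few statements (e.g. `CREATE INDEX CONCURRENTLY`) cannot run inside a transaction, so this only fits migrations made of transactional DDL.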


@@ -0,0 +1,39 @@
-- Migration: Create users table
-- Created: 2024-01-01
CREATE TABLE IF NOT EXISTS users (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
email VARCHAR(255) UNIQUE NOT NULL,
name VARCHAR(255) NOT NULL,
password_hash VARCHAR(255) NOT NULL,
role VARCHAR(20) NOT NULL DEFAULT 'user' CHECK (role IN ('user', 'admin')),
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
last_login TIMESTAMP WITH TIME ZONE,
is_active BOOLEAN DEFAULT true,
CONSTRAINT users_email_check CHECK (email ~* '^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$')
);
-- Create index on email for faster lookups
CREATE INDEX IF NOT EXISTS idx_users_email ON users(email);
-- Create index on role for admin queries
CREATE INDEX IF NOT EXISTS idx_users_role ON users(role);
-- Create index on is_active for filtering active users
CREATE INDEX IF NOT EXISTS idx_users_is_active ON users(is_active);
-- Create trigger to update updated_at timestamp
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
NEW.updated_at = CURRENT_TIMESTAMP;
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
CREATE TRIGGER update_users_updated_at
BEFORE UPDATE ON users
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();


@@ -0,0 +1,36 @@
-- Migration: Create documents table
-- Created: 2024-01-01
CREATE TABLE IF NOT EXISTS documents (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
original_file_name VARCHAR(500) NOT NULL,
file_path VARCHAR(1000) NOT NULL,
file_size BIGINT NOT NULL CHECK (file_size > 0),
uploaded_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
status VARCHAR(50) NOT NULL DEFAULT 'uploaded' CHECK (status IN ('uploaded', 'extracting_text', 'processing_llm', 'generating_pdf', 'completed', 'failed')),
extracted_text TEXT,
generated_summary TEXT,
summary_markdown_path VARCHAR(1000),
summary_pdf_path VARCHAR(1000),
processing_started_at TIMESTAMP WITH TIME ZONE,
processing_completed_at TIMESTAMP WITH TIME ZONE,
error_message TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
-- Create indexes for better query performance
CREATE INDEX IF NOT EXISTS idx_documents_user_id ON documents(user_id);
CREATE INDEX IF NOT EXISTS idx_documents_status ON documents(status);
CREATE INDEX IF NOT EXISTS idx_documents_uploaded_at ON documents(uploaded_at);
CREATE INDEX IF NOT EXISTS idx_documents_processing_completed_at ON documents(processing_completed_at);
-- Create composite index for user's documents with status
CREATE INDEX IF NOT EXISTS idx_documents_user_status ON documents(user_id, status);
-- Create trigger to update updated_at timestamp
CREATE TRIGGER update_documents_updated_at
BEFORE UPDATE ON documents
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();


@@ -0,0 +1,19 @@
-- Migration: Create document_feedback table
-- Created: 2024-01-01
CREATE TABLE IF NOT EXISTS document_feedback (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
document_id UUID NOT NULL REFERENCES documents(id) ON DELETE CASCADE,
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
feedback TEXT NOT NULL,
regeneration_instructions TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
-- Create indexes for better query performance
CREATE INDEX IF NOT EXISTS idx_document_feedback_document_id ON document_feedback(document_id);
CREATE INDEX IF NOT EXISTS idx_document_feedback_user_id ON document_feedback(user_id);
CREATE INDEX IF NOT EXISTS idx_document_feedback_created_at ON document_feedback(created_at);
-- Create composite index for document feedback with user
CREATE INDEX IF NOT EXISTS idx_document_feedback_document_user ON document_feedback(document_id, user_id);


@@ -0,0 +1,23 @@
-- Migration: Create document_versions table
-- Created: 2024-01-01
CREATE TABLE IF NOT EXISTS document_versions (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
document_id UUID NOT NULL REFERENCES documents(id) ON DELETE CASCADE,
version_number INTEGER NOT NULL,
summary_markdown TEXT NOT NULL,
summary_pdf_path VARCHAR(1000) NOT NULL,
feedback TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
-- Ensure unique version numbers per document
UNIQUE(document_id, version_number)
);
-- Create indexes for better query performance
CREATE INDEX IF NOT EXISTS idx_document_versions_document_id ON document_versions(document_id);
CREATE INDEX IF NOT EXISTS idx_document_versions_version_number ON document_versions(version_number);
CREATE INDEX IF NOT EXISTS idx_document_versions_created_at ON document_versions(created_at);
-- Create composite index for document versions
CREATE INDEX IF NOT EXISTS idx_document_versions_document_version ON document_versions(document_id, version_number);


@@ -0,0 +1,24 @@
-- Migration: Create processing_jobs table
-- Created: 2024-01-01
CREATE TABLE IF NOT EXISTS processing_jobs (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
document_id UUID NOT NULL REFERENCES documents(id) ON DELETE CASCADE,
type VARCHAR(50) NOT NULL CHECK (type IN ('text_extraction', 'llm_processing', 'pdf_generation')),
status VARCHAR(50) NOT NULL DEFAULT 'pending' CHECK (status IN ('pending', 'processing', 'completed', 'failed')),
progress INTEGER NOT NULL DEFAULT 0 CHECK (progress >= 0 AND progress <= 100),
error_message TEXT,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
started_at TIMESTAMP WITH TIME ZONE,
completed_at TIMESTAMP WITH TIME ZONE
);
-- Create indexes for better query performance
CREATE INDEX IF NOT EXISTS idx_processing_jobs_document_id ON processing_jobs(document_id);
CREATE INDEX IF NOT EXISTS idx_processing_jobs_type ON processing_jobs(type);
CREATE INDEX IF NOT EXISTS idx_processing_jobs_status ON processing_jobs(status);
CREATE INDEX IF NOT EXISTS idx_processing_jobs_created_at ON processing_jobs(created_at);
-- Create composite indexes for common queries
CREATE INDEX IF NOT EXISTS idx_processing_jobs_document_status ON processing_jobs(document_id, status);
CREATE INDEX IF NOT EXISTS idx_processing_jobs_type_status ON processing_jobs(type, status);

backend/src/models/seed.ts

@@ -0,0 +1,243 @@
import bcrypt from 'bcryptjs';
import { UserModel } from './UserModel';
import { DocumentModel } from './DocumentModel';
import { ProcessingJobModel } from './ProcessingJobModel';
import logger from '../utils/logger';
import { config } from '../config/env';
import pool from '../config/database';
class DatabaseSeeder {
/**
* Seed the database with initial data
*/
async seed(): Promise<void> {
try {
logger.info('Starting database seeding...');
// Seed users
await this.seedUsers();
// Seed documents (if any users were created)
await this.seedDocuments();
// Seed processing jobs
await this.seedProcessingJobs();
logger.info('Database seeding completed successfully');
} catch (error) {
logger.error('Database seeding failed:', error);
throw error;
}
}
/**
* Seed users
*/
private async seedUsers(): Promise<void> {
const users = [
{
email: 'admin@example.com',
name: 'Admin User',
password: 'admin123',
role: 'admin' as const
},
{
email: 'user1@example.com',
name: 'John Doe',
password: 'user123',
role: 'user' as const
},
{
email: 'user2@example.com',
name: 'Jane Smith',
password: 'user123',
role: 'user' as const
}
];
for (const userData of users) {
try {
// Check if user already exists
const existingUser = await UserModel.findByEmail(userData.email);
if (!existingUser) {
// Hash password
const hashedPassword = await bcrypt.hash(userData.password, config.security.bcryptRounds);
// Create user
await UserModel.create({
...userData,
password: hashedPassword
});
logger.info(`Created user: ${userData.email}`);
} else {
logger.info(`User already exists: ${userData.email}`);
}
} catch (error) {
logger.error(`Error creating user ${userData.email}:`, error);
}
}
}
/**
* Seed documents
*/
private async seedDocuments(): Promise<void> {
try {
// Get a user to associate documents with
const user = await UserModel.findByEmail('user1@example.com');
if (!user) {
logger.warn('No user found for seeding documents');
return;
}
const documents = [
{
user_id: user.id,
original_file_name: 'sample_cim_1.pdf',
file_path: '/uploads/sample_cim_1.pdf',
file_size: 2048576, // 2MB
status: 'completed' as const
},
{
user_id: user.id,
original_file_name: 'sample_cim_2.pdf',
file_path: '/uploads/sample_cim_2.pdf',
file_size: 3145728, // 3MB
status: 'processing_llm' as const
},
{
user_id: user.id,
original_file_name: 'sample_cim_3.pdf',
file_path: '/uploads/sample_cim_3.pdf',
file_size: 1048576, // 1MB
status: 'uploaded' as const
}
];
for (const docData of documents) {
try {
// Check if document already exists (by file path)
const existingDocs = await DocumentModel.findByUserId(user.id);
const exists = existingDocs.some(doc => doc.file_path === docData.file_path);
if (!exists) {
await DocumentModel.create(docData);
logger.info(`Created document: ${docData.original_file_name}`);
} else {
logger.info(`Document already exists: ${docData.original_file_name}`);
}
} catch (error) {
logger.error(`Error creating document ${docData.original_file_name}:`, error);
}
}
} catch (error) {
logger.error('Error seeding documents:', error);
}
}
/**
* Seed processing jobs
*/
private async seedProcessingJobs(): Promise<void> {
try {
// Get a document to associate jobs with
const user = await UserModel.findByEmail('user1@example.com');
if (!user) {
logger.warn('No user found for seeding processing jobs');
return;
}
const documents = await DocumentModel.findByUserId(user.id);
if (documents.length === 0) {
logger.warn('No documents found for seeding processing jobs');
return;
}
const document = documents[0]; // Use first document
if (!document) {
logger.warn('No document found for seeding processing jobs');
return;
}
const jobs = [
{
document_id: document.id,
type: 'text_extraction' as const,
status: 'completed' as const,
progress: 100
},
{
document_id: document.id,
type: 'llm_processing' as const,
status: 'processing' as const,
progress: 75
},
{
document_id: document.id,
type: 'pdf_generation' as const,
status: 'pending' as const,
progress: 0
}
];
for (const jobData of jobs) {
try {
// Check if job already exists
const existingJobs = await ProcessingJobModel.findByDocumentId(document.id);
const exists = existingJobs.some(job => job.type === jobData.type);
if (!exists) {
const job = await ProcessingJobModel.create({
document_id: jobData.document_id,
type: jobData.type
});
// Update status and progress
await ProcessingJobModel.updateStatus(job.id, jobData.status);
await ProcessingJobModel.updateProgress(job.id, jobData.progress);
logger.info(`Created processing job: ${jobData.type}`);
} else {
logger.info(`Processing job already exists: ${jobData.type}`);
}
} catch (error) {
logger.error(`Error creating processing job ${jobData.type}:`, error);
}
}
} catch (error) {
logger.error('Error seeding processing jobs:', error);
}
}
/**
* Clear all seeded data
*/
async clear(): Promise<void> {
try {
logger.info('Clearing seeded data...');
// Clear in reverse order to respect foreign key constraints
await pool.query('DELETE FROM processing_jobs');
await pool.query('DELETE FROM document_versions');
await pool.query('DELETE FROM document_feedback');
await pool.query('DELETE FROM documents');
await pool.query('DELETE FROM users WHERE email IN ($1, $2, $3)', [
'admin@example.com',
'user1@example.com',
'user2@example.com'
]);
logger.info('Seeded data cleared successfully');
} catch (error) {
logger.error('Error clearing seeded data:', error);
throw error;
}
}
}
export default DatabaseSeeder;
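The check-then-create pattern in `seedUsers` has a small race window between `findByEmail` and `create`. Since the users migration declares `email` UNIQUE, one hedged alternative is to push idempotency into the database with `ON CONFLICT DO NOTHING` (the `QueryRunner` interface below is a stand-in for the pg pool, not part of this commit):

```typescript
// Stand-in for the pg pool; rowCount mirrors pg's QueryResult shape.
interface QueryRunner {
  query(text: string, params?: unknown[]): Promise<{ rowCount: number }>;
}

// Insert a seed user, relying on the UNIQUE(email) constraint for idempotency.
// Returns true when a new row was actually created.
async function seedUser(
  db: QueryRunner,
  email: string,
  name: string,
  passwordHash: string
): Promise<boolean> {
  const result = await db.query(
    `INSERT INTO users (email, name, password_hash)
     VALUES ($1, $2, $3)
     ON CONFLICT (email) DO NOTHING`,
    [email, name, passwordHash]
  );
  return result.rowCount > 0;
}
```

This also collapses two round-trips per seed row into one, which matters little for seeding but makes the pattern safe to reuse elsewhere.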

backend/src/models/types.ts

@@ -0,0 +1,120 @@
// Database model interfaces
export interface User {
id: string;
email: string;
name: string;
password_hash: string;
role: 'user' | 'admin';
created_at: Date;
updated_at: Date;
last_login?: Date;
is_active: boolean;
}
export interface Document {
id: string;
user_id: string;
original_file_name: string;
file_path: string;
file_size: number;
uploaded_at: Date;
status: ProcessingStatus;
extracted_text?: string;
generated_summary?: string;
summary_markdown_path?: string;
summary_pdf_path?: string;
processing_started_at?: Date;
processing_completed_at?: Date;
error_message?: string;
created_at: Date;
updated_at: Date;
}
export interface DocumentFeedback {
id: string;
document_id: string;
user_id: string;
feedback: string;
regeneration_instructions?: string;
created_at: Date;
}
export interface DocumentVersion {
id: string;
document_id: string;
version_number: number;
summary_markdown: string;
summary_pdf_path: string;
feedback?: string;
created_at: Date;
}
export interface ProcessingJob {
id: string;
document_id: string;
type: JobType;
status: JobStatus;
progress: number;
error_message?: string;
created_at: Date;
started_at?: Date;
completed_at?: Date;
}
export type ProcessingStatus =
| 'uploaded'
| 'extracting_text'
| 'processing_llm'
| 'generating_pdf'
| 'completed'
| 'failed';
export type JobType = 'text_extraction' | 'llm_processing' | 'pdf_generation';
export type JobStatus = 'pending' | 'processing' | 'completed' | 'failed';
// Database query result types
export interface UserWithDocuments extends User {
documents?: Document[];
}
export interface DocumentWithFeedback extends Document {
feedback?: DocumentFeedback[];
versions?: DocumentVersion[];
processing_jobs?: ProcessingJob[];
}
// Input types for creating/updating records
export interface CreateUserInput {
email: string;
name: string;
password: string;
role?: 'user' | 'admin';
}
export interface CreateDocumentInput {
user_id: string;
original_file_name: string;
file_path: string;
file_size: number;
}
export interface CreateDocumentFeedbackInput {
document_id: string;
user_id: string;
feedback: string;
regeneration_instructions?: string;
}
export interface CreateDocumentVersionInput {
document_id: string;
version_number: number;
summary_markdown: string;
summary_pdf_path: string;
feedback?: string;
}
export interface CreateProcessingJobInput {
document_id: string;
type: JobType;
}
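The `ProcessingStatus` union above duplicates the CHECK constraint list in the documents migration. A hedged sketch of deriving the union from a single `const` array instead, which also yields a runtime guard for untyped values coming back from the database:

```typescript
// Single source of truth for the status list; keep in sync with the
// CHECK constraint in the documents migration.
const PROCESSING_STATUSES = [
  'uploaded',
  'extracting_text',
  'processing_llm',
  'generating_pdf',
  'completed',
  'failed',
] as const;

type ProcessingStatus = (typeof PROCESSING_STATUSES)[number];

// Runtime guard for values of unknown provenance (raw DB rows, request bodies).
function isProcessingStatus(value: string): value is ProcessingStatus {
  return (PROCESSING_STATUSES as readonly string[]).includes(value);
}
```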


@@ -0,0 +1,59 @@
import { Router } from 'express';
import {
register,
login,
logout,
refreshToken,
getProfile,
updateProfile
} from '../controllers/authController';
import {
authenticateToken,
authRateLimit
} from '../middleware/auth';
const router = Router();
/**
* @route POST /api/auth/register
* @desc Register a new user
* @access Public
*/
router.post('/register', authRateLimit, register);
/**
* @route POST /api/auth/login
* @desc Login user
* @access Public
*/
router.post('/login', authRateLimit, login);
/**
* @route POST /api/auth/logout
* @desc Logout user
* @access Private
*/
router.post('/logout', authenticateToken, logout);
/**
* @route POST /api/auth/refresh
* @desc Refresh access token
* @access Public
*/
router.post('/refresh', authRateLimit, refreshToken);
/**
* @route GET /api/auth/profile
* @desc Get current user profile
* @access Private
*/
router.get('/profile', authenticateToken, getProfile);
/**
* @route PUT /api/auth/profile
* @desc Update current user profile
* @access Private
*/
router.put('/profile', authenticateToken, updateProfile);
export default router;


@@ -0,0 +1,146 @@
import { Router } from 'express';
import { auth } from '../middleware/auth';
import { validateDocumentUpload } from '../middleware/validation';
const router = Router();
// Apply authentication middleware to all document routes
router.use(auth);
// GET /api/documents - Get all documents for the authenticated user
router.get('/', async (_req, res, next) => {
try {
// TODO: Implement document listing
res.json({
success: true,
data: [],
message: 'Documents retrieved successfully',
});
} catch (error) {
next(error);
}
});
// GET /api/documents/:id - Get a specific document
router.get('/:id', async (req, res, next) => {
try {
const { id: _id } = req.params;
// TODO: Implement document retrieval
res.json({
success: true,
data: null,
message: 'Document retrieved successfully',
});
} catch (error) {
next(error);
}
});
// POST /api/documents - Upload and process a new document
router.post('/', validateDocumentUpload, async (_req, res, next) => {
try {
// TODO: Implement document upload and processing
res.status(201).json({
success: true,
data: {
id: 'temp-id',
status: 'uploaded',
},
message: 'Document uploaded successfully',
});
} catch (error) {
next(error);
}
});
// GET /api/documents/:id/download - Download processed document
router.get('/:id/download', async (req, res, next) => {
try {
const { id: _id } = req.params;
const { format: _format = 'pdf' } = req.query;
// TODO: Implement document download
res.json({
success: true,
data: {
downloadUrl: `/api/documents/${_id}/file`,
format: _format,
},
message: 'Download link generated successfully',
});
} catch (error) {
next(error);
}
});
// GET /api/documents/:id/file - Stream document file
router.get('/:id/file', async (req, res, next) => {
try {
const { id: _id } = req.params;
const { format: _format = 'pdf' } = req.query;
// TODO: Implement file streaming
res.status(404).json({
success: false,
error: 'File not found',
});
} catch (error) {
next(error);
}
});
// POST /api/documents/:id/feedback - Submit feedback for document regeneration
router.post('/:id/feedback', async (req, res, next) => {
try {
const { id: _id } = req.params;
const { feedback: _feedback } = req.body;
// TODO: Implement feedback submission
res.json({
success: true,
data: {
feedbackId: 'temp-feedback-id',
},
message: 'Feedback submitted successfully',
});
} catch (error) {
next(error);
}
});
// POST /api/documents/:id/regenerate - Regenerate document with feedback
router.post('/:id/regenerate', async (req, res, next) => {
try {
const { id: _id } = req.params;
const { feedbackId: _feedbackId } = req.body;
// TODO: Implement document regeneration
res.json({
success: true,
data: {
jobId: 'temp-job-id',
status: 'processing',
},
message: 'Document regeneration started',
});
} catch (error) {
next(error);
}
});
// DELETE /api/documents/:id - Delete a document
router.delete('/:id', async (req, res, next) => {
try {
const { id: _id } = req.params;
// TODO: Implement document deletion
res.json({
success: true,
message: 'Document deleted successfully',
});
} catch (error) {
next(error);
}
});
export default router;


@@ -0,0 +1,32 @@
#!/usr/bin/env ts-node
import DatabaseMigrator from '../models/migrate';
import DatabaseSeeder from '../models/seed';
import logger from '../utils/logger';
async function setupDatabase() {
try {
logger.info('Starting database setup...');
// Run migrations
const migrator = new DatabaseMigrator();
await migrator.migrate();
// Seed database
const seeder = new DatabaseSeeder();
await seeder.seed();
logger.info('Database setup completed successfully');
process.exit(0);
} catch (error) {
logger.error('Database setup failed:', error);
process.exit(1);
}
}
// Run if called directly
if (require.main === module) {
setupDatabase();
}
export default setupDatabase;


@@ -0,0 +1,313 @@
import { createClient, RedisClientType } from 'redis';
import { config } from '../config/env';
import logger from '../utils/logger';
export interface SessionData {
userId: string;
email: string;
role: string;
refreshToken: string;
lastActivity: number;
}
class SessionService {
private client: RedisClientType;
private isConnected: boolean = false;
constructor() {
this.client = createClient({
url: config.redis.url,
socket: {
host: config.redis.host,
port: config.redis.port,
reconnectStrategy: (retries) => {
if (retries > 10) {
logger.error('Redis connection failed after 10 retries');
return new Error('Redis connection failed');
}
return Math.min(retries * 100, 3000);
}
}
});
this.setupEventHandlers();
}
private setupEventHandlers(): void {
this.client.on('connect', () => {
logger.info('Connected to Redis');
this.isConnected = true;
});
this.client.on('ready', () => {
logger.info('Redis client ready');
});
this.client.on('error', (error) => {
logger.error('Redis client error:', error);
this.isConnected = false;
});
this.client.on('end', () => {
logger.info('Redis connection ended');
this.isConnected = false;
});
this.client.on('reconnecting', () => {
logger.info('Reconnecting to Redis...');
});
}
/**
* Connect to Redis
*/
async connect(): Promise<void> {
if (this.isConnected) {
return;
}
try {
await this.client.connect();
logger.info('Successfully connected to Redis');
} catch (error) {
logger.error('Failed to connect to Redis:', error);
throw error;
}
}
/**
* Disconnect from Redis
*/
async disconnect(): Promise<void> {
if (!this.isConnected) {
return;
}
try {
await this.client.quit();
logger.info('Disconnected from Redis');
} catch (error) {
logger.error('Error disconnecting from Redis:', error);
}
}
/**
* Store user session
*/
async storeSession(userId: string, sessionData: Omit<SessionData, 'lastActivity'>): Promise<void> {
try {
await this.connect();
const session: SessionData = {
...sessionData,
lastActivity: Date.now()
};
const key = `session:${userId}`;
const sessionTTL = parseInt(config.jwt.refreshExpiresIn.replace(/[^0-9]/g, '')) *
(config.jwt.refreshExpiresIn.includes('h') ? 3600 :
config.jwt.refreshExpiresIn.includes('d') ? 86400 : 60);
await this.client.setEx(key, sessionTTL, JSON.stringify(session));
logger.info(`Stored session for user: ${userId}`);
} catch (error) {
logger.error('Error storing session:', error);
throw new Error('Failed to store session');
}
}
/**
* Get user session
*/
async getSession(userId: string): Promise<SessionData | null> {
try {
await this.connect();
const key = `session:${userId}`;
const sessionData = await this.client.get(key);
if (!sessionData) {
return null;
}
const session: SessionData = JSON.parse(sessionData);
// Update last activity
session.lastActivity = Date.now();
await this.updateSessionActivity(userId, session.lastActivity);
logger.info(`Retrieved session for user: ${userId}`);
return session;
} catch (error) {
logger.error('Error getting session:', error);
return null;
}
}
/**
* Update session activity timestamp
*/
async updateSessionActivity(userId: string, lastActivity: number): Promise<void> {
try {
await this.connect();
const key = `session:${userId}`;
const sessionData = await this.client.get(key);
if (sessionData) {
const session: SessionData = JSON.parse(sessionData);
session.lastActivity = lastActivity;
const sessionTTL = parseInt(config.jwt.refreshExpiresIn.replace(/[^0-9]/g, '')) *
(config.jwt.refreshExpiresIn.includes('h') ? 3600 :
config.jwt.refreshExpiresIn.includes('d') ? 86400 : 60);
await this.client.setEx(key, sessionTTL, JSON.stringify(session));
}
} catch (error) {
logger.error('Error updating session activity:', error);
}
}
/**
* Remove user session
*/
async removeSession(userId: string): Promise<void> {
try {
await this.connect();
const key = `session:${userId}`;
await this.client.del(key);
logger.info(`Removed session for user: ${userId}`);
} catch (error) {
logger.error('Error removing session:', error);
throw new Error('Failed to remove session');
}
}
/**
* Check if session exists
*/
async sessionExists(userId: string): Promise<boolean> {
try {
await this.connect();
const key = `session:${userId}`;
const exists = await this.client.exists(key);
return exists === 1;
} catch (error) {
logger.error('Error checking session existence:', error);
return false;
}
}
/**
* Store refresh token for blacklisting
*/
async blacklistToken(token: string, expiresIn: number): Promise<void> {
try {
await this.connect();
const key = `blacklist:${token}`;
await this.client.setEx(key, expiresIn, '1');
logger.info('Token blacklisted successfully');
} catch (error) {
logger.error('Error blacklisting token:', error);
throw new Error('Failed to blacklist token');
}
}
/**
* Check if token is blacklisted
*/
async isTokenBlacklisted(token: string): Promise<boolean> {
try {
await this.connect();
const key = `blacklist:${token}`;
const exists = await this.client.exists(key);
return exists === 1;
} catch (error) {
logger.error('Error checking token blacklist:', error);
return false;
}
}
/**
* Get all active sessions (for admin)
*/
async getAllSessions(): Promise<{ userId: string; session: SessionData }[]> {
try {
await this.connect();
const keys = await this.client.keys('session:*');
const sessions: { userId: string; session: SessionData }[] = [];
for (const key of keys) {
const userId = key.replace('session:', '');
const sessionData = await this.client.get(key);
if (sessionData) {
sessions.push({
userId,
session: JSON.parse(sessionData)
});
}
}
return sessions;
} catch (error) {
logger.error('Error getting all sessions:', error);
return [];
}
}
/**
* Clean up expired sessions
*/
async cleanupExpiredSessions(): Promise<number> {
try {
await this.connect();
const keys = await this.client.keys('session:*');
let cleanedCount = 0;
for (const key of keys) {
const sessionData = await this.client.get(key);
if (sessionData) {
const session: SessionData = JSON.parse(sessionData);
const now = Date.now();
const sessionTTL = parseInt(config.jwt.refreshExpiresIn.replace(/[^0-9]/g, '')) *
(config.jwt.refreshExpiresIn.includes('h') ? 3600 :
config.jwt.refreshExpiresIn.includes('d') ? 86400 : 60) * 1000;
if (now - session.lastActivity > sessionTTL) {
await this.client.del(key);
cleanedCount++;
}
}
}
logger.info(`Cleaned up ${cleanedCount} expired sessions`);
return cleanedCount;
} catch (error) {
logger.error('Error cleaning up expired sessions:', error);
return 0;
}
}
/**
* Get Redis connection status
*/
getConnectionStatus(): boolean {
return this.isConnected;
}
}
// Export singleton instance
export const sessionService = new SessionService();
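The TTL computation above is repeated verbatim in `storeSession`, `updateSessionActivity`, and `cleanupExpiredSessions` (and again in `generateAuthTokens`). It could be factored into one pure helper. A minimal sketch under that assumption; the name `parseDurationSeconds` is hypothetical and not part of this codebase:

```typescript
// Convert a JWT-style duration string ("15m", "1h", "7d") to seconds,
// matching the inline ternaries above: hours, then days, otherwise minutes.
// Note the original logic treats any other unit (e.g. "30s") as minutes.
function parseDurationSeconds(duration: string): number {
  const value = parseInt(duration.replace(/[^0-9]/g, ''), 10);
  const unitSeconds = duration.includes('h') ? 3600
    : duration.includes('d') ? 86400
    : 60;
  return value * unitSeconds;
}
```

Each call site could then reduce to `const sessionTTL = parseDurationSeconds(config.jwt.refreshExpiresIn);`, keeping the three copies from drifting apart.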


@@ -0,0 +1,92 @@
import request from 'supertest';
import app from '../index';
describe('Server Setup', () => {
describe('Health Check', () => {
it('should return 200 for health check endpoint', async () => {
const response = await request(app).get('/health');
expect(response.status).toBe(200);
expect(response.body).toHaveProperty('status', 'ok');
expect(response.body).toHaveProperty('timestamp');
expect(response.body).toHaveProperty('uptime');
expect(response.body).toHaveProperty('environment');
});
});
describe('API Root', () => {
it('should return API information', async () => {
const response = await request(app).get('/api');
expect(response.status).toBe(200);
expect(response.body).toHaveProperty('message', 'CIM Document Processor API');
expect(response.body).toHaveProperty('version', '1.0.0');
expect(response.body).toHaveProperty('endpoints');
expect(response.body.endpoints).toHaveProperty('auth');
expect(response.body.endpoints).toHaveProperty('documents');
expect(response.body.endpoints).toHaveProperty('health');
});
});
describe('Authentication Routes', () => {
it('should have auth routes mounted', async () => {
const response = await request(app).post('/api/auth/login');
// Should not return 404 (route exists)
expect(response.status).not.toBe(404);
});
});
describe('Document Routes', () => {
it('should have document routes mounted', async () => {
const response = await request(app).get('/api/documents');
// Should return 401 (unauthorized) rather than 404 (not found)
// This indicates the route exists but requires authentication
expect(response.status).toBe(401);
});
});
describe('404 Handler', () => {
it('should return 404 for non-existent routes', async () => {
const response = await request(app).get('/api/nonexistent');
expect(response.status).toBe(404);
expect(response.body).toHaveProperty('success', false);
expect(response.body).toHaveProperty('error');
expect(response.body).toHaveProperty('message');
});
});
describe('CORS', () => {
it('should include CORS headers', async () => {
const response = await request(app)
.options('/api')
.set('Origin', 'http://localhost:3000');
expect(response.headers).toHaveProperty('access-control-allow-origin');
expect(response.headers).toHaveProperty('access-control-allow-methods');
expect(response.headers).toHaveProperty('access-control-allow-headers');
});
});
describe('Security Headers', () => {
it('should include security headers', async () => {
const response = await request(app).get('/health');
expect(response.headers).toHaveProperty('x-frame-options');
expect(response.headers).toHaveProperty('x-content-type-options');
expect(response.headers).toHaveProperty('x-xss-protection');
});
});
describe('Rate Limiting', () => {
it('should include rate limit headers', async () => {
const response = await request(app).get('/health');
expect(response.headers).toHaveProperty('ratelimit-limit');
expect(response.headers).toHaveProperty('ratelimit-remaining');
expect(response.headers).toHaveProperty('ratelimit-reset');
});
});
});

backend/src/test/setup.ts Normal file

@@ -0,0 +1,89 @@
// Jest test setup file
// Mock Redis
jest.mock('redis', () => ({
createClient: jest.fn(() => ({
connect: jest.fn().mockResolvedValue(undefined),
disconnect: jest.fn().mockResolvedValue(undefined),
quit: jest.fn().mockResolvedValue(undefined),
on: jest.fn(),
get: jest.fn().mockResolvedValue(null),
set: jest.fn().mockResolvedValue('OK'),
del: jest.fn().mockResolvedValue(1),
exists: jest.fn().mockResolvedValue(0),
keys: jest.fn().mockResolvedValue([]),
scan: jest.fn().mockResolvedValue(['0', []]),
expire: jest.fn().mockResolvedValue(1),
ttl: jest.fn().mockResolvedValue(-1)
}))
}));
// Mock environment variables for testing
(process.env as any).NODE_ENV = 'test';
(process.env as any).JWT_SECRET = 'test-jwt-secret';
(process.env as any).JWT_REFRESH_SECRET = 'test-refresh-secret';
(process.env as any).DATABASE_URL = 'postgresql://test:test@localhost:5432/test_db';
(process.env as any).DB_HOST = 'localhost';
(process.env as any).DB_PORT = '5432';
(process.env as any).DB_NAME = 'test_db';
(process.env as any).DB_USER = 'test';
(process.env as any).DB_PASSWORD = 'test';
(process.env as any).REDIS_URL = 'redis://localhost:6379';
(process.env as any).LLM_PROVIDER = 'anthropic';
(process.env as any).ANTHROPIC_API_KEY = 'dummy_key';
// Global test timeout
jest.setTimeout(10000);
// Suppress console logs during tests unless there's an error
const originalConsoleLog = console.log;
const originalConsoleInfo = console.info;
const originalConsoleWarn = console.warn;
beforeAll(() => {
console.log = jest.fn();
console.info = jest.fn();
console.warn = jest.fn();
});
afterAll(() => {
console.log = originalConsoleLog;
console.info = originalConsoleInfo;
console.warn = originalConsoleWarn;
});
// Global test utilities
(global as any).testUtils = {
// Helper to create mock database results
createMockDbResult: (data: any) => ({
rows: Array.isArray(data) ? data : [data],
rowCount: Array.isArray(data) ? data.length : 1
}),
// Helper to create mock user data
createMockUser: (overrides = {}) => ({
id: '123e4567-e89b-12d3-a456-426614174000',
email: 'test@example.com',
name: 'Test User',
password_hash: 'hashed_password',
role: 'user',
created_at: new Date(),
updated_at: new Date(),
is_active: true,
...overrides
}),
// Helper to create mock document data
createMockDocument: (overrides = {}) => ({
id: '123e4567-e89b-12d3-a456-426614174001',
user_id: '123e4567-e89b-12d3-a456-426614174000',
original_file_name: 'test.pdf',
file_path: '/uploads/test.pdf',
file_size: 1024000,
uploaded_at: new Date(),
status: 'uploaded',
created_at: new Date(),
updated_at: new Date(),
...overrides
})
};


@@ -0,0 +1,305 @@
import {
generateAccessToken,
generateRefreshToken,
generateAuthTokens,
verifyAccessToken,
verifyRefreshToken,
hashPassword,
comparePassword,
validatePassword,
extractTokenFromHeader,
decodeToken
} from '../auth';
// Config is mocked below, so we don't need to import it
// Mock the config
jest.mock('../../config/env', () => ({
config: {
jwt: {
secret: 'test-secret',
refreshSecret: 'test-refresh-secret',
expiresIn: '1h',
refreshExpiresIn: '7d'
},
security: {
bcryptRounds: 10
}
}
}));
// Mock logger
jest.mock('../logger', () => ({
info: jest.fn(),
error: jest.fn()
}));
describe('Auth Utilities', () => {
const mockPayload = {
userId: '123e4567-e89b-12d3-a456-426614174000',
email: 'test@example.com',
role: 'user'
};
describe('generateAccessToken', () => {
it('should generate a valid access token', () => {
const token = generateAccessToken(mockPayload);
expect(token).toBeDefined();
expect(typeof token).toBe('string');
expect(token.split('.')).toHaveLength(3); // JWT has 3 parts
});
it('should include the correct payload in the token', () => {
const token = generateAccessToken(mockPayload);
const decoded = decodeToken(token);
expect(decoded).toMatchObject({
userId: mockPayload.userId,
email: mockPayload.email,
role: mockPayload.role,
iss: 'cim-processor',
aud: 'cim-processor-users'
});
});
});
describe('generateRefreshToken', () => {
it('should generate a valid refresh token', () => {
const token = generateRefreshToken(mockPayload);
expect(token).toBeDefined();
expect(typeof token).toBe('string');
expect(token.split('.')).toHaveLength(3);
});
it('should use refresh secret for signing', () => {
const token = generateRefreshToken(mockPayload);
const decoded = decodeToken(token);
expect(decoded).toMatchObject({
userId: mockPayload.userId,
email: mockPayload.email,
role: mockPayload.role
});
});
});
describe('generateAuthTokens', () => {
it('should generate both access and refresh tokens', () => {
const tokens = generateAuthTokens(mockPayload);
expect(tokens).toHaveProperty('accessToken');
expect(tokens).toHaveProperty('refreshToken');
expect(tokens).toHaveProperty('expiresIn');
expect(typeof tokens.accessToken).toBe('string');
expect(typeof tokens.refreshToken).toBe('string');
expect(typeof tokens.expiresIn).toBe('number');
});
it('should calculate correct expiration time', () => {
const tokens = generateAuthTokens(mockPayload);
// 1h = 3600 seconds
expect(tokens.expiresIn).toBe(3600);
});
});
describe('verifyAccessToken', () => {
it('should verify a valid access token', () => {
const token = generateAccessToken(mockPayload);
const decoded = verifyAccessToken(token);
expect(decoded).toMatchObject({
userId: mockPayload.userId,
email: mockPayload.email,
role: mockPayload.role
});
});
it('should throw error for invalid token', () => {
expect(() => {
verifyAccessToken('invalid-token');
}).toThrow('Invalid or expired access token');
});
it('should throw error for token signed with wrong secret', () => {
const token = generateRefreshToken(mockPayload); // Uses refresh secret
expect(() => {
verifyAccessToken(token); // Expects access secret
}).toThrow('Invalid or expired access token');
});
});
describe('verifyRefreshToken', () => {
it('should verify a valid refresh token', () => {
const token = generateRefreshToken(mockPayload);
const decoded = verifyRefreshToken(token);
expect(decoded).toMatchObject({
userId: mockPayload.userId,
email: mockPayload.email,
role: mockPayload.role
});
});
it('should throw error for invalid refresh token', () => {
expect(() => {
verifyRefreshToken('invalid-token');
}).toThrow('Invalid or expired refresh token');
});
});
describe('hashPassword', () => {
it('should hash password correctly', async () => {
const password = 'TestPassword123!';
const hashedPassword = await hashPassword(password);
expect(hashedPassword).toBeDefined();
expect(typeof hashedPassword).toBe('string');
expect(hashedPassword).not.toBe(password);
expect(hashedPassword.startsWith('$2a$') || hashedPassword.startsWith('$2b$')).toBe(true); // bcrypt format
});
it('should generate different hashes for same password', async () => {
const password = 'TestPassword123!';
const hash1 = await hashPassword(password);
const hash2 = await hashPassword(password);
expect(hash1).not.toBe(hash2);
});
});
describe('comparePassword', () => {
it('should return true for correct password', async () => {
const password = 'TestPassword123!';
const hashedPassword = await hashPassword(password);
const isMatch = await comparePassword(password, hashedPassword);
expect(isMatch).toBe(true);
});
it('should return false for incorrect password', async () => {
const password = 'TestPassword123!';
const wrongPassword = 'WrongPassword123!';
const hashedPassword = await hashPassword(password);
const isMatch = await comparePassword(wrongPassword, hashedPassword);
expect(isMatch).toBe(false);
});
});
describe('validatePassword', () => {
it('should validate a strong password', () => {
const password = 'StrongPass123!';
const result = validatePassword(password);
expect(result.isValid).toBe(true);
expect(result.errors).toHaveLength(0);
});
it('should reject password that is too short', () => {
const password = 'Short1!';
const result = validatePassword(password);
expect(result.isValid).toBe(false);
expect(result.errors).toContain('Password must be at least 8 characters long');
});
it('should reject password without uppercase letter', () => {
const password = 'lowercase123!';
const result = validatePassword(password);
expect(result.isValid).toBe(false);
expect(result.errors).toContain('Password must contain at least one uppercase letter');
});
it('should reject password without lowercase letter', () => {
const password = 'UPPERCASE123!';
const result = validatePassword(password);
expect(result.isValid).toBe(false);
expect(result.errors).toContain('Password must contain at least one lowercase letter');
});
it('should reject password without number', () => {
const password = 'NoNumbers!';
const result = validatePassword(password);
expect(result.isValid).toBe(false);
expect(result.errors).toContain('Password must contain at least one number');
});
it('should reject password without special character', () => {
const password = 'NoSpecialChar123';
const result = validatePassword(password);
expect(result.isValid).toBe(false);
expect(result.errors).toContain('Password must contain at least one special character');
});
it('should return all validation errors for weak password', () => {
const password = 'weak';
const result = validatePassword(password);
expect(result.isValid).toBe(false);
expect(result.errors).toHaveLength(4); // 'weak' has lowercase, so only 4 errors
expect(result.errors).toContain('Password must be at least 8 characters long');
expect(result.errors).toContain('Password must contain at least one uppercase letter');
expect(result.errors).toContain('Password must contain at least one number');
expect(result.errors).toContain('Password must contain at least one special character');
});
});
describe('extractTokenFromHeader', () => {
it('should extract token from valid Authorization header', () => {
const header = 'Bearer valid-token-here';
const token = extractTokenFromHeader(header);
expect(token).toBe('valid-token-here');
});
it('should return null for missing header', () => {
const token = extractTokenFromHeader(undefined);
expect(token).toBeNull();
});
it('should return null for empty header', () => {
const token = extractTokenFromHeader('');
expect(token).toBeNull();
});
it('should return null for invalid format', () => {
const token = extractTokenFromHeader('InvalidFormat token');
expect(token).toBeNull();
});
it('should return null for missing token part', () => {
const token = extractTokenFromHeader('Bearer ');
expect(token).toBeNull();
});
});
describe('decodeToken', () => {
it('should decode a valid token', () => {
const token = generateAccessToken(mockPayload);
const decoded = decodeToken(token);
expect(decoded).toMatchObject({
userId: mockPayload.userId,
email: mockPayload.email,
role: mockPayload.role
});
});
it('should return null for invalid token', () => {
const decoded = decodeToken('invalid-token');
expect(decoded).toBeNull();
});
});
});

backend/src/utils/auth.ts Normal file

@@ -0,0 +1,199 @@
import jwt from 'jsonwebtoken';
import bcrypt from 'bcryptjs';
import { config } from '../config/env';
import logger from './logger';
export interface JWTPayload {
userId: string;
email: string;
role: string;
iat?: number;
exp?: number;
}
export interface AuthTokens {
accessToken: string;
refreshToken: string;
expiresIn: number;
}
/**
* Generate JWT access token
*/
export function generateAccessToken(payload: Omit<JWTPayload, 'iat' | 'exp'>): string {
try {
const token = jwt.sign(payload, config.jwt.secret, {
expiresIn: config.jwt.expiresIn,
issuer: 'cim-processor',
audience: 'cim-processor-users'
});
logger.info(`Generated access token for user: ${payload.email}`);
return token;
} catch (error) {
logger.error('Error generating access token:', error);
throw new Error('Failed to generate access token');
}
}
/**
* Generate JWT refresh token
*/
export function generateRefreshToken(payload: Omit<JWTPayload, 'iat' | 'exp'>): string {
try {
const token = jwt.sign(payload, config.jwt.refreshSecret, {
expiresIn: config.jwt.refreshExpiresIn,
issuer: 'cim-processor',
audience: 'cim-processor-users'
});
logger.info(`Generated refresh token for user: ${payload.email}`);
return token;
} catch (error) {
logger.error('Error generating refresh token:', error);
throw new Error('Failed to generate refresh token');
}
}
/**
* Generate both access and refresh tokens
*/
export function generateAuthTokens(payload: Omit<JWTPayload, 'iat' | 'exp'>): AuthTokens {
const accessToken = generateAccessToken(payload);
const refreshToken = generateRefreshToken(payload);
// Calculate expiration time in seconds
const expiresIn = parseInt(config.jwt.expiresIn.replace(/[^0-9]/g, '')) *
(config.jwt.expiresIn.includes('h') ? 3600 :
config.jwt.expiresIn.includes('d') ? 86400 : 60);
return {
accessToken,
refreshToken,
expiresIn
};
}
/**
* Verify JWT access token
*/
export function verifyAccessToken(token: string): JWTPayload {
try {
const decoded = jwt.verify(token, config.jwt.secret, {
issuer: 'cim-processor',
audience: 'cim-processor-users'
}) as JWTPayload;
logger.info(`Verified access token for user: ${decoded.email}`);
return decoded;
} catch (error) {
logger.error('Error verifying access token:', error);
throw new Error('Invalid or expired access token');
}
}
/**
* Verify JWT refresh token
*/
export function verifyRefreshToken(token: string): JWTPayload {
try {
const decoded = jwt.verify(token, config.jwt.refreshSecret, {
issuer: 'cim-processor',
audience: 'cim-processor-users'
}) as JWTPayload;
logger.info(`Verified refresh token for user: ${decoded.email}`);
return decoded;
} catch (error) {
logger.error('Error verifying refresh token:', error);
throw new Error('Invalid or expired refresh token');
}
}
/**
* Hash password using bcrypt
*/
export async function hashPassword(password: string): Promise<string> {
try {
const hashedPassword = await bcrypt.hash(password, config.security.bcryptRounds);
logger.info('Password hashed successfully');
return hashedPassword;
} catch (error) {
logger.error('Error hashing password:', error);
throw new Error('Failed to hash password');
}
}
/**
* Compare password with hashed password
*/
export async function comparePassword(password: string, hashedPassword: string): Promise<boolean> {
try {
const isMatch = await bcrypt.compare(password, hashedPassword);
logger.info(`Password comparison result: ${isMatch}`);
return isMatch;
} catch (error) {
logger.error('Error comparing passwords:', error);
throw new Error('Failed to compare passwords');
}
}
/**
* Validate password strength
*/
export function validatePassword(password: string): { isValid: boolean; errors: string[] } {
const errors: string[] = [];
if (password.length < 8) {
errors.push('Password must be at least 8 characters long');
}
if (!/[A-Z]/.test(password)) {
errors.push('Password must contain at least one uppercase letter');
}
if (!/[a-z]/.test(password)) {
errors.push('Password must contain at least one lowercase letter');
}
if (!/[0-9]/.test(password)) {
errors.push('Password must contain at least one number');
}
if (!/[!@#$%^&*(),.?":{}|<>]/.test(password)) {
errors.push('Password must contain at least one special character');
}
return {
isValid: errors.length === 0,
errors
};
}
/**
* Extract token from Authorization header
*/
export function extractTokenFromHeader(authHeader: string | undefined): string | null {
if (!authHeader) {
return null;
}
const parts = authHeader.split(' ');
if (parts.length !== 2 || parts[0] !== 'Bearer') {
return null;
}
return parts[1] || null;
}
/**
* Decode JWT token without verification (for debugging)
*/
export function decodeToken(token: string): any {
try {
return jwt.decode(token);
} catch (error) {
logger.error('Error decoding token:', error);
return null;
}
}
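The Bearer-scheme parsing in `extractTokenFromHeader` has the edge cases the unit tests exercise: exactly one space, a literal `Bearer` prefix, and an empty token part treated as missing. A standalone sketch with the same semantics (the name `parseBearer` is hypothetical, used only so the snippet is self-contained):

```typescript
// Accept only "Bearer <token>" with exactly one space; an empty token part
// ("Bearer ") is treated as missing, mirroring extractTokenFromHeader.
function parseBearer(header: string | undefined): string | null {
  if (!header) {
    return null;
  }
  const parts = header.split(' ');
  if (parts.length !== 2 || parts[0] !== 'Bearer') {
    return null;
  }
  return parts[1] || null;
}
```

The `parts.length !== 2` check also rejects headers with extra spaces ("Bearer a b"), which `startsWith('Bearer ')`-style parsing would silently accept.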


@@ -0,0 +1,46 @@
import winston from 'winston';
import path from 'path';
import fs from 'fs';
import { config } from '../config/env';
// Create logs directory if it doesn't exist
const logsDir = path.dirname(config.logging.file);
if (!fs.existsSync(logsDir)) {
fs.mkdirSync(logsDir, { recursive: true });
}
// Define log format
const logFormat = winston.format.combine(
winston.format.timestamp(),
winston.format.errors({ stack: true }),
winston.format.json()
);
// Create logger instance
export const logger = winston.createLogger({
level: config.logging.level,
format: logFormat,
transports: [
// Write all logs with level 'error' and below to error.log
new winston.transports.File({
filename: path.join(logsDir, 'error.log'),
level: 'error',
}),
// Write all logs with level 'info' and below to combined.log
new winston.transports.File({
filename: config.logging.file,
}),
],
});
// If we're not in production, log to the console as well
if (config.env !== 'production') {
logger.add(new winston.transports.Console({
format: winston.format.combine(
winston.format.colorize(),
winston.format.simple()
),
}));
}
export default logger;

backend/tsconfig.json Normal file

@@ -0,0 +1,33 @@
{
"compilerOptions": {
"target": "ES2020",
"module": "commonjs",
"lib": ["ES2020"],
"outDir": "./dist",
"rootDir": "./src",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"declaration": true,
"declarationMap": true,
"sourceMap": true,
"removeComments": true,
"noImplicitAny": true,
"noImplicitReturns": true,
"noImplicitThis": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"exactOptionalPropertyTypes": true,
"noImplicitOverride": true,
"noPropertyAccessFromIndexSignature": true,
"noUncheckedIndexedAccess": true,
"baseUrl": ".",
"paths": {
"@/*": ["./src/*"]
}
},
"include": ["src/**/*"],
"exclude": ["node_modules", "dist", "**/*.test.ts", "**/*.spec.ts"]
}

frontend/.env.example Normal file

@@ -0,0 +1,5 @@
# Frontend Environment Variables
VITE_API_BASE_URL=http://localhost:5000/api
VITE_APP_NAME=CIM Document Processor
VITE_MAX_FILE_SIZE=104857600
VITE_ALLOWED_FILE_TYPES=application/pdf

frontend/index.html Normal file

@@ -0,0 +1,13 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>CIM Document Processor</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.tsx"></script>
</body>
</html>

frontend/package-lock.json generated Normal file

File diff suppressed because it is too large

frontend/package.json Normal file

@@ -0,0 +1,46 @@
{
"name": "cim-processor-frontend",
"private": true,
"version": "0.0.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "tsc && vite build",
"lint": "eslint . --ext ts,tsx --report-unused-disable-directives --max-warnings 0",
"preview": "vite preview",
"test": "vitest --run",
"test:watch": "vitest"
},
"dependencies": {
"@tanstack/react-query": "^5.8.4",
"axios": "^1.6.2",
"clsx": "^2.0.0",
"lucide-react": "^0.294.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react-dropzone": "^14.2.3",
"react-hook-form": "^7.48.2",
"react-router-dom": "^6.20.1",
"tailwind-merge": "^2.0.0"
},
"devDependencies": {
"@testing-library/jest-dom": "^6.1.4",
"@testing-library/react": "^13.4.0",
"@testing-library/user-event": "^14.5.1",
"@types/react": "^18.2.37",
"@types/react-dom": "^18.2.15",
"@typescript-eslint/eslint-plugin": "^6.10.0",
"@typescript-eslint/parser": "^6.10.0",
"@vitejs/plugin-react": "^4.1.1",
"autoprefixer": "^10.4.16",
"eslint": "^8.53.0",
"eslint-plugin-react-hooks": "^4.6.0",
"eslint-plugin-react-refresh": "^0.4.4",
"jsdom": "^26.1.0",
"postcss": "^8.4.31",
"tailwindcss": "^3.3.5",
"typescript": "^5.2.2",
"vite": "^4.5.0",
"vitest": "^0.34.6"
}
}

frontend/src/App.tsx Normal file

@@ -0,0 +1,119 @@
import React from 'react';
import { BrowserRouter as Router, Routes, Route, Navigate } from 'react-router-dom';
import { AuthProvider, useAuth } from './contexts/AuthContext';
import LoginForm from './components/LoginForm';
import ProtectedRoute from './components/ProtectedRoute';
import LogoutButton from './components/LogoutButton';
// Simple dashboard component for demonstration
const Dashboard: React.FC = () => {
const { user } = useAuth();
return (
<div className="min-h-screen bg-gray-50">
<nav className="bg-white shadow-sm border-b">
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
<div className="flex justify-between h-16">
<div className="flex items-center">
<h1 className="text-xl font-semibold text-gray-900">
CIM Document Processor
</h1>
</div>
<div className="flex items-center space-x-4">
<span className="text-sm text-gray-700">
Welcome, {user?.name || user?.email}
</span>
<LogoutButton variant="link" />
</div>
</div>
</div>
</nav>
<main className="max-w-7xl mx-auto py-6 sm:px-6 lg:px-8">
<div className="px-4 py-6 sm:px-0">
<div className="border-4 border-dashed border-gray-200 rounded-lg h-96 flex items-center justify-center">
<div className="text-center">
<h2 className="text-2xl font-medium text-gray-900 mb-4">
Dashboard
</h2>
<p className="text-gray-600">
Welcome to the CIM Document Processor dashboard.
</p>
<p className="text-sm text-gray-500 mt-2">
Role: {user?.role}
</p>
</div>
</div>
</div>
</main>
</div>
);
};
// Login page component
const LoginPage: React.FC = () => {
const { user } = useAuth();
// Redirect to dashboard if already authenticated
if (user) {
return <Navigate to="/dashboard" replace />;
}
return (
<div className="min-h-screen bg-gray-50 flex flex-col justify-center py-12 sm:px-6 lg:px-8">
<div className="sm:mx-auto sm:w-full sm:max-w-md">
<h2 className="mt-6 text-center text-3xl font-extrabold text-gray-900">
CIM Document Processor
</h2>
</div>
<div className="mt-8 sm:mx-auto sm:w-full sm:max-w-md">
<LoginForm />
</div>
</div>
);
};
// Unauthorized page component
const UnauthorizedPage: React.FC = () => {
return (
<div className="min-h-screen bg-gray-50 flex flex-col justify-center py-12 sm:px-6 lg:px-8">
<div className="sm:mx-auto sm:w-full sm:max-w-md">
<div className="bg-white py-8 px-4 shadow sm:rounded-lg sm:px-10">
<div className="text-center">
<h2 className="text-2xl font-bold text-gray-900 mb-4">
Access Denied
</h2>
<p className="text-gray-600 mb-6">
You don't have permission to access this resource.
</p>
<LogoutButton />
</div>
</div>
</div>
</div>
);
};
const App: React.FC = () => {
return (
<AuthProvider>
<Router>
<Routes>
<Route path="/login" element={<LoginPage />} />
<Route path="/unauthorized" element={<UnauthorizedPage />} />
<Route
path="/dashboard"
element={
<ProtectedRoute>
<Dashboard />
</ProtectedRoute>
}
/>
<Route path="/" element={<Navigate to="/dashboard" replace />} />
</Routes>
</Router>
</AuthProvider>
);
};
export default App;


@@ -0,0 +1,168 @@
import React, { useState } from 'react';
import { useAuth } from '../contexts/AuthContext';
import { validateLoginForm } from '../utils/validation';
import { cn } from '../utils/cn';
import { Eye, EyeOff, LogIn } from 'lucide-react';
interface LoginFormProps {
onSuccess?: () => void;
}
export const LoginForm: React.FC<LoginFormProps> = ({ onSuccess }) => {
const { login, isLoading, error } = useAuth();
const [formData, setFormData] = useState({
email: '',
password: '',
});
const [formErrors, setFormErrors] = useState<{ email?: string; password?: string }>({});
const [showPassword, setShowPassword] = useState(false);
const handleInputChange = (e: React.ChangeEvent<HTMLInputElement>) => {
const { name, value } = e.target;
setFormData(prev => ({
...prev,
[name]: value,
}));
// Clear field-specific error when user starts typing
if (formErrors[name as keyof typeof formErrors]) {
setFormErrors(prev => ({
...prev,
[name]: undefined,
}));
}
};
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
// Clear previous errors
setFormErrors({});
// Validate form
const validation = validateLoginForm(formData.email, formData.password);
if (!validation.isValid) {
setFormErrors(validation.errors);
return;
}
try {
await login(formData);
onSuccess?.();
} catch (error) {
// Error is handled by the auth context
console.error('Login failed:', error);
}
};
return (
<div className="w-full max-w-md mx-auto">
<div className="bg-white shadow-lg rounded-lg p-8">
<div className="text-center mb-8">
<h1 className="text-2xl font-bold text-gray-900">Sign In</h1>
<p className="text-gray-600 mt-2">Access your CIM Document Processor</p>
</div>
<form onSubmit={handleSubmit} className="space-y-6">
{/* Email Field */}
<div>
<label htmlFor="email" className="block text-sm font-medium text-gray-700 mb-2">
Email Address
</label>
<input
id="email"
name="email"
type="email"
autoComplete="email"
required
value={formData.email}
onChange={handleInputChange}
className={cn(
"w-full px-3 py-2 border rounded-md shadow-sm placeholder-gray-400 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-500",
formErrors.email ? "border-red-300" : "border-gray-300"
)}
placeholder="Enter your email"
disabled={isLoading}
/>
{formErrors.email && (
<p className="mt-1 text-sm text-red-600">{formErrors.email}</p>
)}
</div>
{/* Password Field */}
<div>
<label htmlFor="password" className="block text-sm font-medium text-gray-700 mb-2">
Password
</label>
<div className="relative">
<input
id="password"
name="password"
type={showPassword ? 'text' : 'password'}
autoComplete="current-password"
required
value={formData.password}
onChange={handleInputChange}
className={cn(
"w-full px-3 py-2 pr-10 border rounded-md shadow-sm placeholder-gray-400 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-500",
formErrors.password ? "border-red-300" : "border-gray-300"
)}
placeholder="Enter your password"
disabled={isLoading}
/>
<button
type="button"
className="absolute inset-y-0 right-0 pr-3 flex items-center"
onClick={() => setShowPassword(!showPassword)}
disabled={isLoading}
>
{showPassword ? (
<EyeOff className="h-4 w-4 text-gray-400" />
) : (
<Eye className="h-4 w-4 text-gray-400" />
)}
</button>
</div>
{formErrors.password && (
<p className="mt-1 text-sm text-red-600">{formErrors.password}</p>
)}
</div>
{/* Global Error Message */}
{error && (
<div className="bg-red-50 border border-red-200 rounded-md p-3">
<p className="text-sm text-red-600">{error}</p>
</div>
)}
{/* Submit Button */}
<button
type="submit"
disabled={isLoading}
className={cn(
"w-full flex justify-center items-center py-2 px-4 border border-transparent rounded-md shadow-sm text-sm font-medium text-white",
"focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500",
isLoading
? "bg-gray-400 cursor-not-allowed"
: "bg-blue-600 hover:bg-blue-700"
)}
>
{isLoading ? (
<>
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-white mr-2"></div>
Signing in...
</>
) : (
<>
<LogIn className="h-4 w-4 mr-2" />
Sign In
</>
)}
</button>
</form>
</div>
</div>
);
};
export default LoginForm;
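The form delegates validation to `validateLoginForm` from `../utils/validation`, a file not included in this hunk. A hypothetical sketch, reconstructed from the error strings asserted in `LoginForm` component tests elsewhere in this commit, might look like:

```typescript
// Hypothetical sketch of ../utils/validation -- the real file is not shown in
// this diff. Error strings mirror those asserted in the component tests.
interface LoginValidationResult {
  isValid: boolean;
  errors: { email?: string; password?: string };
}

// Deliberately loose email check; the real implementation may differ.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

export function validateLoginForm(email: string, password: string): LoginValidationResult {
  const errors: LoginValidationResult['errors'] = {};
  if (!email.trim()) {
    errors.email = 'Email is required';
  } else if (!EMAIL_RE.test(email)) {
    errors.email = 'Please enter a valid email address';
  }
  if (!password) {
    errors.password = 'Password is required';
  } else if (password.length < 6) {
    errors.password = 'Password must be at least 6 characters long';
  }
  return { isValid: Object.keys(errors).length === 0, errors };
}
```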


@@ -0,0 +1,81 @@
import React, { useState } from 'react';
import { useAuth } from '../contexts/AuthContext';
import { cn } from '../utils/cn';
import { LogOut } from 'lucide-react';
interface LogoutButtonProps {
className?: string;
showConfirmation?: boolean;
variant?: 'button' | 'link';
}
export const LogoutButton: React.FC<LogoutButtonProps> = ({
className,
showConfirmation = true,
variant = 'button',
}) => {
const { logout, isLoading } = useAuth();
const [showConfirmDialog, setShowConfirmDialog] = useState(false);
const handleLogout = async () => {
if (showConfirmation && !showConfirmDialog) {
setShowConfirmDialog(true);
return;
}
try {
await logout();
setShowConfirmDialog(false);
} catch (error) {
console.error('Logout failed:', error);
}
};
const handleCancel = () => {
setShowConfirmDialog(false);
};
if (showConfirmDialog) {
return (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white rounded-lg p-6 max-w-sm mx-4">
<h3 className="text-lg font-medium text-gray-900 mb-4">Confirm Logout</h3>
<p className="text-gray-600 mb-6">Are you sure you want to sign out?</p>
<div className="flex space-x-3">
<button
onClick={handleLogout}
disabled={isLoading}
className="flex-1 bg-red-600 text-white py-2 px-4 rounded-md hover:bg-red-700 focus:outline-none focus:ring-2 focus:ring-red-500 disabled:opacity-50"
>
{isLoading ? 'Signing out...' : 'Sign Out'}
</button>
<button
onClick={handleCancel}
disabled={isLoading}
className="flex-1 bg-gray-200 text-gray-800 py-2 px-4 rounded-md hover:bg-gray-300 focus:outline-none focus:ring-2 focus:ring-gray-500"
>
Cancel
</button>
</div>
</div>
</div>
);
}
const baseClasses = variant === 'button'
? "inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md text-white bg-red-600 hover:bg-red-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-red-500 disabled:opacity-50"
: "inline-flex items-center text-sm text-gray-700 hover:text-red-600 focus:outline-none focus:underline";
return (
<button
onClick={handleLogout}
disabled={isLoading}
className={cn(baseClasses, className)}
>
<LogOut className="h-4 w-4 mr-2" />
{isLoading ? 'Signing out...' : 'Sign Out'}
</button>
);
};
export default LogoutButton;
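Both components merge conditional class strings with `cn` from `../utils/cn`, which is not part of this hunk. A minimal clsx-style sketch (an assumption; many projects instead delegate to `clsx` plus `tailwind-merge`) is:

```typescript
// Hypothetical sketch of ../utils/cn -- not part of this diff.
// Joins truthy class values, dropping false/null/undefined from conditionals.
type ClassValue = string | false | null | undefined;

export function cn(...values: ClassValue[]): string {
  return values.filter(Boolean).join(' ');
}
```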


@@ -0,0 +1,42 @@
import React, { ReactNode } from 'react';
import { Navigate, useLocation } from 'react-router-dom';
import { useAuth } from '../contexts/AuthContext';
interface ProtectedRouteProps {
children: ReactNode;
requiredRole?: 'user' | 'admin';
fallbackPath?: string;
}
export const ProtectedRoute: React.FC<ProtectedRouteProps> = ({
children,
requiredRole,
fallbackPath = '/login',
}) => {
const { user, isLoading, isInitialized } = useAuth();
const location = useLocation();
// Show loading spinner while checking authentication
if (isLoading || !isInitialized) {
return (
<div className="min-h-screen flex items-center justify-center">
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600"></div>
</div>
);
}
// Redirect to login if not authenticated
if (!user) {
return <Navigate to={fallbackPath} state={{ from: location }} replace />;
}
// Check role-based access if required
if (requiredRole && user.role !== requiredRole) {
// If user doesn't have required role, redirect to unauthorized page or dashboard
return <Navigate to="/unauthorized" replace />;
}
return <>{children}</>;
};
export default ProtectedRoute;


@@ -0,0 +1,341 @@
import React from 'react';
import { render, screen, waitFor, fireEvent, act } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { vi, describe, it, expect, beforeEach } from 'vitest';
import LoginForm from '../LoginForm';
import { AuthProvider } from '../../contexts/AuthContext';
import { authService } from '../../services/authService';
// Mock the auth service
vi.mock('../../services/authService', () => ({
authService: {
login: vi.fn(),
logout: vi.fn(),
getToken: vi.fn(),
getCurrentUser: vi.fn(),
validateToken: vi.fn(),
},
}));
const MockedAuthService = authService as any;
// Wrapper component for tests
const TestWrapper: React.FC<{ children: React.ReactNode }> = ({ children }) => (
<AuthProvider>{children}</AuthProvider>
);
// Helper to wait for auth initialization
const waitForAuthInit = async () => {
await waitFor(() => {
expect(screen.getByLabelText(/email address/i)).toBeInTheDocument();
}, { timeout: 5000 });
};
describe('LoginForm', () => {
const user = userEvent.setup();
beforeEach(() => {
vi.clearAllMocks();
// Set up default mocks to prevent async initialization issues
MockedAuthService.getToken.mockReturnValue(null);
MockedAuthService.getCurrentUser.mockReturnValue(null);
MockedAuthService.validateToken.mockResolvedValue(null);
});
it('renders login form with all required fields', async () => {
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
expect(screen.getByLabelText(/email address/i)).toBeInTheDocument();
expect(screen.getByLabelText(/password/i)).toBeInTheDocument();
expect(screen.getByRole('button', { name: /sign in/i })).toBeInTheDocument();
});
it('shows validation errors for empty fields', async () => {
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(screen.getByText(/email is required/i)).toBeInTheDocument();
});
expect(screen.getByText(/password is required/i)).toBeInTheDocument();
});
it('shows validation error for invalid email format', async () => {
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const passwordInput = screen.getByLabelText(/password/i);
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
await act(async () => {
await user.type(emailInput, 'invalid-email');
await user.type(passwordInput, 'password123');
});
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(screen.getByText(/please enter a valid email address/i)).toBeInTheDocument();
});
});
it('shows validation error for short password', async () => {
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const passwordInput = screen.getByLabelText(/password/i);
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
await act(async () => {
await user.type(emailInput, 'test@example.com');
await user.type(passwordInput, '123');
});
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(screen.getByText(/password must be at least 6 characters long/i)).toBeInTheDocument();
});
});
it('toggles password visibility', async () => {
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const passwordInput = screen.getByLabelText(/password/i) as HTMLInputElement;
const toggleButtons = screen.getAllByRole('button');
const toggleButton = toggleButtons.find(button => button.getAttribute('type') === 'button' && !button.textContent?.includes('Sign'));
expect(passwordInput.type).toBe('password');
if (toggleButton) {
await act(async () => {
await user.click(toggleButton);
});
expect(passwordInput.type).toBe('text');
await act(async () => {
await user.click(toggleButton);
});
expect(passwordInput.type).toBe('password');
}
});
it('clears field errors when user starts typing', async () => {
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
// Trigger validation error
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(screen.getByText(/email is required/i)).toBeInTheDocument();
});
// Start typing to clear error
await act(async () => {
await user.type(emailInput, 'test@example.com');
});
await waitFor(() => {
expect(screen.queryByText(/email is required/i)).not.toBeInTheDocument();
});
});
it('calls login service with correct credentials', async () => {
const mockAuthResult = {
user: { id: '1', email: 'test@example.com', name: 'Test User', role: 'user' as const, createdAt: '2023-01-01', updatedAt: '2023-01-01' },
token: 'mock-token',
refreshToken: 'mock-refresh-token',
};
MockedAuthService.login.mockResolvedValue(mockAuthResult);
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const passwordInput = screen.getByLabelText(/password/i);
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
await act(async () => {
await user.type(emailInput, 'test@example.com');
await user.type(passwordInput, 'password123');
});
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(MockedAuthService.login).toHaveBeenCalledWith({
email: 'test@example.com',
password: 'password123',
});
});
});
it('shows loading state during login', async () => {
MockedAuthService.login.mockImplementation(() => new Promise(resolve => setTimeout(resolve, 100)));
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const passwordInput = screen.getByLabelText(/password/i);
const submitButton = screen.getByRole('button', { name: /sign in/i });
await act(async () => {
await user.type(emailInput, 'test@example.com');
await user.type(passwordInput, 'password123');
await user.click(submitButton);
});
expect(screen.getByText(/signing in.../i)).toBeInTheDocument();
expect(submitButton).toBeDisabled();
});
it('shows error message when login fails', async () => {
MockedAuthService.login.mockRejectedValue(new Error('Invalid credentials'));
await act(async () => {
render(
<TestWrapper>
<LoginForm />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const passwordInput = screen.getByLabelText(/password/i);
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
await act(async () => {
await user.type(emailInput, 'test@example.com');
await user.type(passwordInput, 'wrongpassword');
});
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(screen.getByText(/invalid credentials/i)).toBeInTheDocument();
});
});
it('calls onSuccess callback when login succeeds', async () => {
const mockOnSuccess = vi.fn();
const mockAuthResult = {
user: { id: '1', email: 'test@example.com', name: 'Test User', role: 'user' as const, createdAt: '2023-01-01', updatedAt: '2023-01-01' },
token: 'mock-token',
refreshToken: 'mock-refresh-token',
};
MockedAuthService.login.mockResolvedValue(mockAuthResult);
await act(async () => {
render(
<TestWrapper>
<LoginForm onSuccess={mockOnSuccess} />
</TestWrapper>
);
});
await waitForAuthInit();
const emailInput = screen.getByLabelText(/email address/i);
const passwordInput = screen.getByLabelText(/password/i);
const form = screen.getByRole('button', { name: /sign in/i }).closest('form');
await act(async () => {
await user.type(emailInput, 'test@example.com');
await user.type(passwordInput, 'password123');
});
await act(async () => {
fireEvent.submit(form!);
});
await waitFor(() => {
expect(mockOnSuccess).toHaveBeenCalled();
});
});
});


@@ -0,0 +1,269 @@
import React from 'react';
import { render, screen, waitFor, act } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import { vi, describe, it, expect, beforeEach } from 'vitest';
import LogoutButton from '../LogoutButton';
import { AuthProvider } from '../../contexts/AuthContext';
import { authService } from '../../services/authService';
// Mock the auth service
vi.mock('../../services/authService', () => ({
authService: {
login: vi.fn(),
logout: vi.fn(),
getToken: vi.fn(),
getCurrentUser: vi.fn(),
validateToken: vi.fn(),
},
}));
const MockedAuthService = authService as any;
// Wrapper component for tests
const TestWrapper: React.FC<{ children: React.ReactNode }> = ({ children }) => (
<AuthProvider>{children}</AuthProvider>
);
// Helper to wait for auth initialization
const waitForAuthInit = async () => {
await waitFor(() => {
expect(screen.getByRole('button', { name: /sign out/i })).toBeInTheDocument();
}, { timeout: 5000 });
};
describe('LogoutButton', () => {
const user = userEvent.setup();
beforeEach(() => {
vi.clearAllMocks();
MockedAuthService.getToken.mockReturnValue('mock-token');
MockedAuthService.getCurrentUser.mockReturnValue({
id: '1',
email: 'test@example.com',
name: 'Test User',
role: 'user',
});
MockedAuthService.validateToken.mockResolvedValue({
id: '1',
email: 'test@example.com',
name: 'Test User',
role: 'user',
});
MockedAuthService.logout.mockResolvedValue(undefined);
});
it('renders logout button with default variant', async () => {
await act(async () => {
render(
<TestWrapper>
<LogoutButton />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
expect(button).toBeInTheDocument();
expect(button).toHaveClass('bg-red-600'); // Button variant styling
});
it('renders logout link with link variant', async () => {
await act(async () => {
render(
<TestWrapper>
<LogoutButton variant="link" />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
expect(button).toBeInTheDocument();
expect(button).not.toHaveClass('bg-red-600'); // Link variant styling
});
it('shows confirmation dialog when showConfirmation is true', async () => {
await act(async () => {
render(
<TestWrapper>
<LogoutButton showConfirmation={true} />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(button);
});
await waitFor(() => {
expect(screen.getByText(/confirm logout/i)).toBeInTheDocument();
});
expect(screen.getByText(/are you sure you want to sign out/i)).toBeInTheDocument();
expect(screen.getByRole('button', { name: /cancel/i })).toBeInTheDocument();
});
it('does not show confirmation dialog when showConfirmation is false', async () => {
await act(async () => {
render(
<TestWrapper>
<LogoutButton showConfirmation={false} />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(button);
});
// Should not show confirmation dialog, should call logout directly
await waitFor(() => {
expect(MockedAuthService.logout).toHaveBeenCalled();
});
});
it('calls logout service when confirmed', async () => {
// Ensure the mock is properly set up
MockedAuthService.logout.mockResolvedValue(undefined);
await act(async () => {
render(
<TestWrapper>
<LogoutButton showConfirmation={true} />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(button);
});
await waitFor(() => {
expect(screen.getByText(/confirm logout/i)).toBeInTheDocument();
});
// In the confirmation dialog, there's only one "Sign Out" button
const confirmButton = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(confirmButton);
});
// Wait for the logout to be called
await waitFor(() => {
expect(MockedAuthService.logout).toHaveBeenCalled();
}, { timeout: 3000 });
});
it('cancels logout when cancel button is clicked', async () => {
await act(async () => {
render(
<TestWrapper>
<LogoutButton showConfirmation={true} />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(button);
});
await waitFor(() => {
expect(screen.getByText(/confirm logout/i)).toBeInTheDocument();
});
const cancelButton = screen.getByRole('button', { name: /cancel/i });
await act(async () => {
await user.click(cancelButton);
});
await waitFor(() => {
expect(screen.queryByText(/confirm logout/i)).not.toBeInTheDocument();
});
expect(MockedAuthService.logout).not.toHaveBeenCalled();
});
it('shows loading state during logout', async () => {
// Mock logout to be slow so we can see loading state
MockedAuthService.logout.mockImplementation(() => new Promise(resolve => setTimeout(resolve, 100)));
await act(async () => {
render(
<TestWrapper>
<LogoutButton showConfirmation={false} />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(button);
});
// Should show loading state immediately
await waitFor(() => {
expect(screen.getByText(/signing out.../i)).toBeInTheDocument();
});
const loadingButton = screen.getByText(/signing out.../i).closest('button');
expect(loadingButton).toBeDisabled();
});
it('handles logout errors gracefully', async () => {
const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {});
MockedAuthService.logout.mockRejectedValue(new Error('Logout failed'));
await act(async () => {
render(
<TestWrapper>
<LogoutButton showConfirmation={false} />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
await act(async () => {
await user.click(button);
});
// The error is logged in AuthContext, not directly in the component
await waitFor(() => {
expect(consoleSpy).toHaveBeenCalledWith('Logout error:', expect.any(Error));
});
consoleSpy.mockRestore();
});
it('applies custom className', async () => {
await act(async () => {
render(
<TestWrapper>
<LogoutButton className="custom-class" />
</TestWrapper>
);
});
await waitForAuthInit();
const button = screen.getByRole('button', { name: /sign out/i });
expect(button).toHaveClass('custom-class');
});
});


@@ -0,0 +1,132 @@
import React from 'react';
import { render, screen } from '@testing-library/react';
import { MemoryRouter, Routes, Route } from 'react-router-dom';
import { vi, describe, it, expect, beforeEach } from 'vitest';
import ProtectedRoute from '../ProtectedRoute';
// Mock the useAuth hook to control its output in tests
const mockUseAuth = vi.fn();
vi.mock('../../contexts/AuthContext', () => ({
useAuth: () => mockUseAuth(),
}));
const TestComponent: React.FC = () => <div>Protected Content</div>;
const LoginComponent: React.FC = () => <div>Login Page</div>;
const UnauthorizedComponent: React.FC = () => <div>Unauthorized Page</div>;
const renderWithRouter = (ui: React.ReactNode, { initialEntries = ['/protected'] } = {}) => {
return render(
<MemoryRouter initialEntries={initialEntries}>
<Routes>
<Route path="/login" element={<LoginComponent />} />
<Route path="/unauthorized" element={<UnauthorizedComponent />} />
<Route path="/protected" element={ui} />
</Routes>
</MemoryRouter>
);
};
describe('ProtectedRoute', () => {
beforeEach(() => {
vi.resetAllMocks();
});
it('shows a loading spinner while authentication is in progress', () => {
mockUseAuth.mockReturnValue({
user: null,
isLoading: true,
isInitialized: false,
});
renderWithRouter(
<ProtectedRoute>
<TestComponent />
</ProtectedRoute>
);
expect(document.querySelector('.animate-spin')).toBeInTheDocument();
expect(screen.queryByText('Protected Content')).not.toBeInTheDocument();
});
it('redirects to the login page if the user is not authenticated', () => {
mockUseAuth.mockReturnValue({
user: null,
isLoading: false,
isInitialized: true,
});
renderWithRouter(
<ProtectedRoute>
<TestComponent />
</ProtectedRoute>
);
expect(screen.getByText('Login Page')).toBeInTheDocument();
expect(screen.queryByText('Protected Content')).not.toBeInTheDocument();
});
it('renders the protected content if the user is authenticated', () => {
mockUseAuth.mockReturnValue({
user: { id: '1', role: 'user' },
isLoading: false,
isInitialized: true,
});
renderWithRouter(
<ProtectedRoute>
<TestComponent />
</ProtectedRoute>
);
expect(screen.getByText('Protected Content')).toBeInTheDocument();
});
it('redirects to an unauthorized page if the user does not have the required role', () => {
mockUseAuth.mockReturnValue({
user: { id: '1', role: 'user' },
isLoading: false,
isInitialized: true,
});
renderWithRouter(
<ProtectedRoute requiredRole="admin">
<TestComponent />
</ProtectedRoute>
);
expect(screen.getByText('Unauthorized Page')).toBeInTheDocument();
expect(screen.queryByText('Protected Content')).not.toBeInTheDocument();
});
it('renders the protected content if the user has the required admin role', () => {
mockUseAuth.mockReturnValue({
user: { id: '1', role: 'admin' },
isLoading: false,
isInitialized: true,
});
renderWithRouter(
<ProtectedRoute requiredRole="admin">
<TestComponent />
</ProtectedRoute>
);
expect(screen.getByText('Protected Content')).toBeInTheDocument();
});
it('renders the protected content if the user has the required user role', () => {
mockUseAuth.mockReturnValue({
user: { id: '1', role: 'user' },
isLoading: false,
isInitialized: true,
});
renderWithRouter(
<ProtectedRoute requiredRole="user">
<TestComponent />
</ProtectedRoute>
);
expect(screen.getByText('Protected Content')).toBeInTheDocument();
});
});


@@ -0,0 +1,18 @@
// Frontend environment configuration
export const config = {
apiBaseUrl: import.meta.env.VITE_API_BASE_URL || 'http://localhost:5000/api',
appName: import.meta.env.VITE_APP_NAME || 'CIM Document Processor',
maxFileSize: parseInt(import.meta.env.VITE_MAX_FILE_SIZE || '104857600', 10), // 100MB default; explicit radix
allowedFileTypes: (import.meta.env.VITE_ALLOWED_FILE_TYPES || 'application/pdf').split(','),
};
// Validate required environment variables
const requiredEnvVars = ['VITE_API_BASE_URL'];
for (const envVar of requiredEnvVars) {
if (!import.meta.env[envVar]) {
console.warn(`Warning: ${envVar} is not set in environment variables`);
}
}
export default config;
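Because `parseInt` yields `NaN` for a non-numeric env value, a guarded helper keeps `maxFileSize` from silently becoming `NaN`. This is an illustrative sketch, not part of the commit:

```typescript
// Illustrative helper (not in this diff): parse a byte limit from an env
// string, falling back when the value is missing, non-numeric, or non-positive.
export function parseByteLimit(raw: string | undefined, fallback: number): number {
  const n = parseInt(raw ?? '', 10);
  return Number.isFinite(n) && n > 0 ? n : fallback;
}
```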


@@ -0,0 +1,105 @@
import React, { createContext, useContext, useEffect, useState, ReactNode } from 'react';
import { User, LoginCredentials, AuthContextType } from '../types/auth';
import { authService } from '../services/authService';
const AuthContext = createContext<AuthContextType | undefined>(undefined);
interface AuthProviderProps {
children: ReactNode;
}
export const AuthProvider: React.FC<AuthProviderProps> = ({ children }) => {
const [user, setUser] = useState<User | null>(null);
const [token, setToken] = useState<string | null>(null);
const [isLoading, setIsLoading] = useState(true);
const [error, setError] = useState<string | null>(null);
const [isInitialized, setIsInitialized] = useState(false);
useEffect(() => {
// Initialize auth state from localStorage and validate token
const initializeAuth = async () => {
try {
const storedToken = authService.getToken();
const storedUser = authService.getCurrentUser();
if (storedToken && storedUser) {
// Validate token with backend
const validatedUser = await authService.validateToken();
if (validatedUser) {
setUser(validatedUser);
setToken(storedToken);
} else {
// Token is invalid, clear everything
setUser(null);
setToken(null);
}
}
} catch (error) {
console.error('Auth initialization error:', error);
setError('Failed to initialize authentication');
setUser(null);
setToken(null);
} finally {
setIsLoading(false);
setIsInitialized(true);
}
};
initializeAuth();
}, []);
const login = async (credentials: LoginCredentials): Promise<void> => {
setIsLoading(true);
setError(null);
try {
const authResult = await authService.login(credentials);
setUser(authResult.user);
setToken(authResult.token);
} catch (error) {
const errorMessage = error instanceof Error ? error.message : 'Login failed';
setError(errorMessage);
throw error;
} finally {
setIsLoading(false);
}
};
const logout = async (): Promise<void> => {
setIsLoading(true);
setError(null);
try {
await authService.logout();
} catch (error) {
console.error('Logout error:', error);
// Continue with logout even if API call fails
} finally {
setUser(null);
setToken(null);
setIsLoading(false);
}
};
const value: AuthContextType = {
user,
token,
login,
logout,
isLoading,
error,
isInitialized,
};
return <AuthContext.Provider value={value}>{children}</AuthContext.Provider>;
};
export const useAuth = (): AuthContextType => {
const context = useContext(AuthContext);
if (context === undefined) {
throw new Error('useAuth must be used within an AuthProvider');
}
return context;
};
export default AuthContext;
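The `User`, `LoginCredentials`, `AuthResult`, and `AuthContextType` types imported throughout come from `../types/auth`, which is not in this hunk. A plausible reconstruction from how the fields are used in the components, context, and test fixtures above:

```typescript
// Hypothetical reconstruction of ../types/auth, inferred from usage in this
// commit (test fixtures, AuthContext value, authService responses).
export interface User {
  id: string;
  email: string;
  name: string;
  role: 'user' | 'admin';
  createdAt: string;
  updatedAt: string;
}

export interface LoginCredentials {
  email: string;
  password: string;
}

export interface AuthResult {
  user: User;
  token: string;
  refreshToken: string;
}

export interface AuthContextType {
  user: User | null;
  token: string | null;
  login: (credentials: LoginCredentials) => Promise<void>;
  logout: () => Promise<void>;
  isLoading: boolean;
  error: string | null;
  isInitialized: boolean;
}
```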

frontend/src/index.css Normal file

@@ -0,0 +1,3 @@
@tailwind base;
@tailwind components;
@tailwind utilities;

frontend/src/main.tsx Normal file

@@ -0,0 +1,10 @@
import React from 'react';
import ReactDOM from 'react-dom/client';
import App from './App';
import './index.css';
ReactDOM.createRoot(document.getElementById('root')!).render(
<React.StrictMode>
<App />
</React.StrictMode>
);


@@ -0,0 +1,103 @@
import axios from 'axios';
import { config } from '../config/env';
import { LoginCredentials, AuthResult, User } from '../types/auth';

const API_BASE_URL = config.apiBaseUrl;

class AuthService {
  private token: string | null = null;

  constructor() {
    // Initialize token from localStorage
    this.token = localStorage.getItem('auth_token');
    if (this.token) {
      this.setAuthHeader(this.token);
    }
  }

  private setAuthHeader(token: string) {
    axios.defaults.headers.common['Authorization'] = `Bearer ${token}`;
  }

  private removeAuthHeader() {
    delete axios.defaults.headers.common['Authorization'];
  }

  async login(credentials: LoginCredentials): Promise<AuthResult> {
    try {
      const response = await axios.post(`${API_BASE_URL}/auth/login`, credentials);
      const authResult: AuthResult = response.data;

      // Store token and set auth header
      this.token = authResult.token;
      localStorage.setItem('auth_token', authResult.token);
      localStorage.setItem('refresh_token', authResult.refreshToken);
      localStorage.setItem('user', JSON.stringify(authResult.user));
      this.setAuthHeader(authResult.token);

      return authResult;
    } catch (error) {
      if (axios.isAxiosError(error)) {
        throw new Error(error.response?.data?.message || 'Login failed');
      }
      throw new Error('An unexpected error occurred');
    }
  }

  async logout(): Promise<void> {
    try {
      if (this.token) {
        await axios.post(`${API_BASE_URL}/auth/logout`);
      }
    } catch (error) {
      // Continue with local logout even if the API call fails
      console.error('Logout API call failed:', error);
    } finally {
      // Clear local storage and auth header
      this.token = null;
      localStorage.removeItem('auth_token');
      localStorage.removeItem('refresh_token');
      localStorage.removeItem('user');
      this.removeAuthHeader();
    }
  }

  async validateToken(): Promise<User | null> {
    if (!this.token) {
      return null;
    }
    try {
      const response = await axios.get(`${API_BASE_URL}/auth/validate`);
      return response.data.user;
    } catch (error) {
      // Token is invalid; clear stored credentials before returning
      await this.logout();
      return null;
    }
  }

  getCurrentUser(): User | null {
    const userStr = localStorage.getItem('user');
    if (userStr) {
      try {
        return JSON.parse(userStr);
      } catch {
        return null;
      }
    }
    return null;
  }

  getToken(): string | null {
    return this.token;
  }

  isAuthenticated(): boolean {
    return !!this.token;
  }
}

export const authService = new AuthService();
export default authService;
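The service above stores the raw JWT but never inspects it; expiry is only discovered when `/auth/validate` rejects the token. If a client-side expiry check were ever wanted as a UX optimization, the `exp` claim can be read without verifying the signature. A minimal sketch — this helper is hypothetical and not part of the codebase, and it assumes a standard three-part JWT with a base64url-encoded JSON payload:

```typescript
// Hypothetical helper: read a JWT's `exp` claim client-side (no signature check).
// The server's /auth/validate endpoint remains the authoritative check.
function tokenExpiry(token: string): Date | null {
  const parts = token.split('.');
  if (parts.length !== 3) return null; // not a JWT-shaped string
  try {
    const payload = JSON.parse(Buffer.from(parts[1], 'base64url').toString('utf8'));
    return typeof payload.exp === 'number'
      ? new Date(payload.exp * 1000) // exp is seconds since epoch
      : null;
  } catch {
    return null; // payload was not valid base64url JSON
  }
}
```

In the browser, `atob` would stand in for `Buffer`; either way this only avoids a doomed round trip and must never replace server-side validation.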

View File

@@ -0,0 +1,55 @@
import '@testing-library/jest-dom';
import { vi, beforeAll, afterAll, beforeEach } from 'vitest';
import { act } from '@testing-library/react';

// Mock localStorage
const localStorageMock = {
  getItem: vi.fn(),
  setItem: vi.fn(),
  removeItem: vi.fn(),
  clear: vi.fn(),
};

Object.defineProperty(window, 'localStorage', {
  value: localStorageMock,
});

// Mock window.location
Object.defineProperty(window, 'location', {
  value: {
    href: 'http://localhost:3000',
    origin: 'http://localhost:3000',
    pathname: '/',
    search: '',
    hash: '',
  },
  writable: true,
});

// Silence console.error to prevent noise in test output
const originalConsoleError = console.error;
beforeAll(() => {
  console.error = vi.fn();
});

afterAll(() => {
  console.error = originalConsoleError;
});

// Reset mocks before each test
beforeEach(() => {
  localStorageMock.getItem.mockClear();
  localStorageMock.setItem.mockClear();
  localStorageMock.removeItem.mockClear();
  localStorageMock.clear.mockClear();
  // Clear all mocks
  vi.clearAllMocks();
});

// Helper to wait for pending async operations to flush
export const waitForAsync = async () => {
  await act(async () => {
    await new Promise(resolve => setTimeout(resolve, 0));
  });
};

View File

@@ -0,0 +1,29 @@
export interface User {
  id: string;
  email: string;
  name: string;
  role: 'user' | 'admin';
  createdAt: string;
  updatedAt: string;
}

export interface LoginCredentials {
  email: string;
  password: string;
}

export interface AuthResult {
  user: User;
  token: string;
  refreshToken: string;
}

export interface AuthContextType {
  user: User | null;
  token: string | null;
  login: (credentials: LoginCredentials) => Promise<void>;
  logout: () => Promise<void>;
  isLoading: boolean;
  error: string | null;
  isInitialized: boolean;
}

6
frontend/src/utils/cn.ts Normal file
View File

@@ -0,0 +1,6 @@
import { clsx, type ClassValue } from 'clsx';
import { twMerge } from 'tailwind-merge';

export function cn(...inputs: ClassValue[]) {
  return twMerge(clsx(inputs));
}

View File

@@ -0,0 +1,43 @@
export const validateEmail = (email: string): string | null => {
  if (!email) {
    return 'Email is required';
  }
  const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
  if (!emailRegex.test(email)) {
    return 'Please enter a valid email address';
  }
  return null;
};

export const validatePassword = (password: string): string | null => {
  if (!password) {
    return 'Password is required';
  }
  if (password.length < 6) {
    return 'Password must be at least 6 characters long';
  }
  return null;
};

export const validateLoginForm = (email: string, password: string) => {
  const errors: { email?: string; password?: string } = {};

  const emailError = validateEmail(email);
  if (emailError) {
    errors.email = emailError;
  }

  const passwordError = validatePassword(password);
  if (passwordError) {
    errors.password = passwordError;
  }

  return {
    errors,
    isValid: Object.keys(errors).length === 0,
  };
};

View File

@@ -0,0 +1,32 @@
/** @type {import('tailwindcss').Config} */
export default {
  content: [
    "./index.html",
    "./src/**/*.{js,ts,jsx,tsx}",
  ],
  theme: {
    extend: {
      colors: {
        primary: {
          50: '#eff6ff',
          500: '#3b82f6',
          600: '#2563eb',
          700: '#1d4ed8',
        },
        gray: {
          50: '#f9fafb',
          100: '#f3f4f6',
          200: '#e5e7eb',
          300: '#d1d5db',
          400: '#9ca3af',
          500: '#6b7280',
          600: '#4b5563',
          700: '#374151',
          800: '#1f2937',
          900: '#111827',
        },
      },
    },
  },
  plugins: [],
}

35
frontend/tsconfig.json Normal file
View File

@@ -0,0 +1,35 @@
{
  "compilerOptions": {
    "target": "ES2020",
    "useDefineForClassFields": true,
    "lib": ["ES2020", "DOM", "DOM.Iterable"],
    "module": "ESNext",
    "skipLibCheck": true,
    "esModuleInterop": true,

    /* Bundler mode */
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "resolveJsonModule": true,
    "isolatedModules": true,
    "noEmit": true,
    "jsx": "react-jsx",

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noFallthroughCasesInSwitch": true,

    /* Path mapping */
    "baseUrl": ".",
    "paths": {
      "@/*": ["./src/*"]
    },

    /* Types */
    "types": ["vite/client", "vitest/globals"]
  },
  "include": ["src"],
  "references": [{ "path": "./tsconfig.node.json" }]
}

View File

@@ -0,0 +1,10 @@
{
  "compilerOptions": {
    "composite": true,
    "skipLibCheck": true,
    "module": "ESNext",
    "moduleResolution": "bundler",
    "allowSyntheticDefaultImports": true
  },
  "include": ["vite.config.ts"]
}

0
frontend/verify-auth.js Normal file
View File

27
frontend/vite.config.ts Normal file
View File

@@ -0,0 +1,27 @@
// Import from 'vitest/config' so the `test` field below type-checks;
// plain 'vite' does not know about Vitest's config options.
import { defineConfig } from 'vitest/config'
import react from '@vitejs/plugin-react'
import path from 'path'

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [react()],
  resolve: {
    alias: {
      '@': path.resolve(__dirname, './src'),
    },
  },
  server: {
    port: 3000,
    proxy: {
      '/api': {
        target: 'http://localhost:5000',
        changeOrigin: true,
      },
    },
  },
  test: {
    globals: true,
    environment: 'jsdom',
    setupFiles: ['./src/test/setup.ts'],
  },
})