HomeAudit/scripts/validate_vaultwarden_migration.sh
admin 705a2757c1 Major infrastructure migration and Vaultwarden PostgreSQL troubleshooting
COMPREHENSIVE CHANGES:

INFRASTRUCTURE MIGRATION:
- Migrated services to Docker Swarm on OMV800 (192.168.50.229)
- Deployed PostgreSQL database for Vaultwarden migration
- Updated all stack configurations for Docker Swarm compatibility
- Added comprehensive monitoring stack (Prometheus, Grafana, Blackbox)
- Implemented proper secret management for all services
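
The secret-management bullet above follows the usual Docker Swarm pattern: a secret is mounted as a file (under /run/secrets/&lt;name&gt; in a real service) and the entrypoint reads it into an environment variable at startup. A minimal sketch of that read step, with a temp directory standing in for /run/secrets and all names and values purely illustrative:

```shell
set -euo pipefail

# Stand-in for /run/secrets; in Swarm the file is mounted read-only.
SECRET_DIR="$(mktemp -d)"
printf '%s' 'smtp-user@example.com' > "$SECRET_DIR/smtp_user"

# Entrypoint-style read: file contents become the env var the app expects.
SMTP_USERNAME="$(cat "$SECRET_DIR/smtp_user")"
echo "$SMTP_USERNAME"
```

This keeps the credential out of `docker service inspect` output, which is the point of using secrets rather than plain environment variables.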

VAULTWARDEN POSTGRESQL MIGRATION:
- Attempted migration from SQLite to PostgreSQL for NFS compatibility
- Created PostgreSQL stack with proper user/password configuration
- Built custom Vaultwarden image with PostgreSQL support
- Troubleshot persistent SQLite fallback issue despite PostgreSQL config
- Identified known issue where Vaultwarden silently falls back to SQLite
- Added ENABLE_DB_WAL=false to prevent filesystem compatibility issues
- Current status: Old Vaultwarden on lenovo410 still working, new one has config issues
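
The silent SQLite fallback described above typically means the `DATABASE_URL` environment variable never reached the container (or the image was built without PostgreSQL support), since Vaultwarden defaults to SQLite when no database URL is set. A sketch of the URL format it needs — host, user, password, and database name here are illustrative placeholders, not the real stack values:

```shell
set -euo pipefail

# Illustrative placeholders; substitute your actual credentials/host.
DB_USER="vaultwarden"
DB_PASS="example-password"
DB_HOST="postgres"
DB_NAME="vaultwarden"

# The connection string Vaultwarden reads to select the PostgreSQL backend.
DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:5432/${DB_NAME}"
echo "$DATABASE_URL"
```

A quick way to confirm which backend is in use is to check whether the env var is actually present inside the running container and whether a `db.sqlite3` file is still being created in the data directory.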

PAPERLESS SERVICES:
- Successfully deployed Paperless-NGX and Paperless-AI on OMV800
- Both services running on ports 8000 and 3000 respectively
- Caddy configuration updated for external access
- Services accessible via paperless.pressmess.duckdns.org and paperless-ai.pressmess.duckdns.org

CADDY CONFIGURATION:
- Updated Caddyfile on Surface (192.168.50.254) for new service locations
- Fixed Vaultwarden reverse proxy to point to new Docker Swarm service
- Removed old notification hub reference that was causing conflicts
- All services properly configured for external access via DuckDNS
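
The Caddy changes above amount to site blocks of roughly the following shape. The hostname and port come from this commit message (Paperless-NGX on 8000, OMV800 at 192.168.50.229); treat the exact upstream address as an assumption rather than the real Caddyfile contents:

```shell
# Print the kind of Caddyfile site block this change implies.
cat <<'EOF'
paperless.pressmess.duckdns.org {
    reverse_proxy 192.168.50.229:8000
}
EOF
```

Caddy obtains and renews the TLS certificate for the DuckDNS hostname automatically, so each migrated service only needs its `reverse_proxy` upstream updated to the new Swarm host.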

BACKUP AND DISCOVERY:
- Created comprehensive backup system for all hosts
- Generated detailed discovery reports for infrastructure analysis
- Implemented automated backup validation scripts
- Created migration progress tracking and verification reports

MONITORING STACK:
- Deployed Prometheus, Grafana, and Blackbox monitoring
- Created infrastructure and system overview dashboards
- Added proper service discovery and alerting configuration
- Implemented performance monitoring for all critical services

DOCUMENTATION:
- Reorganized documentation into logical structure
- Created comprehensive migration playbook and troubleshooting guides
- Added hardware specifications and optimization recommendations
- Documented all configuration changes and service dependencies

CURRENT STATUS:
- Paperless services: Working and accessible externally
- Vaultwarden: PostgreSQL configuration issues, old instance still working
- Monitoring: Deployed and operational
- Caddy: Updated and working for external access
- PostgreSQL: Database running, connection issues with Vaultwarden

NEXT STEPS:
- Continue troubleshooting Vaultwarden PostgreSQL configuration
- Consider alternative approaches for Vaultwarden migration
- Validate all external service access
- Complete final migration validation

TECHNICAL NOTES:
- Used Docker Swarm for orchestration on OMV800
- Implemented proper secret management for sensitive data
- Added comprehensive logging and monitoring
- Created automated backup and validation scripts
2025-08-30 20:18:44 -04:00


#!/bin/bash
# Vaultwarden Migration Pre-Validation Script
# Ensures 100% backup coverage and validates all prerequisites

set -euo pipefail

# Configuration
SOURCE_HOST="jonathan@192.168.50.181"
SOURCE_PATH="/home/jonathan/vaultwarden/data"
BACKUP_DIR="./backups/vaultwarden"
TARGET_PATH="/export/vaultwarden"
LOG_FILE="./logs/vaultwarden_validation.log"

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Logging functions
log() {
    echo -e "${BLUE}[$(date +'%Y-%m-%d %H:%M:%S')]${NC} $1" | tee -a "$LOG_FILE"
}

log_success() {
    echo -e "${GREEN}[$(date +'%Y-%m-%d %H:%M:%S')] SUCCESS:${NC} $1" | tee -a "$LOG_FILE"
}

log_warning() {
    echo -e "${YELLOW}[$(date +'%Y-%m-%d %H:%M:%S')] WARNING:${NC} $1" | tee -a "$LOG_FILE"
}

log_error() {
    echo -e "${RED}[$(date +'%Y-%m-%d %H:%M:%S')] ERROR:${NC} $1" | tee -a "$LOG_FILE"
}
# Create necessary directories
mkdir -p "$BACKUP_DIR"
mkdir -p "$(dirname "$LOG_FILE")"

log "Starting Vaultwarden migration pre-validation"

# Validation counters
PASSED=0
FAILED=0
WARNINGS=0
# Functions to increment counters.
# Note: plain ((VAR++)) returns a non-zero status when VAR is 0, which
# would abort the script under `set -e`; use arithmetic assignment instead.
increment_passed() {
    PASSED=$((PASSED + 1))
    log_success "$1"
}

increment_failed() {
    FAILED=$((FAILED + 1))
    log_error "$1"
}

increment_warning() {
    WARNINGS=$((WARNINGS + 1))
    log_warning "$1"
}
# Step 1: Verify SSH connectivity to source host
log "Step 1: Verifying SSH connectivity to source host"
if ssh -o ConnectTimeout=10 "$SOURCE_HOST" "echo 'SSH connection successful'" >/dev/null 2>&1; then
    increment_passed "SSH connectivity to $SOURCE_HOST verified"
else
    increment_failed "Cannot establish SSH connection to $SOURCE_HOST"
    exit 1
fi

# Step 2: Verify source Vaultwarden container is running
log "Step 2: Verifying source Vaultwarden container status"
if ssh "$SOURCE_HOST" "docker ps | grep -q vaultwarden"; then
    increment_passed "Vaultwarden container is running on $SOURCE_HOST"
else
    increment_failed "Vaultwarden container is not running on $SOURCE_HOST"
    exit 1
fi

# Step 3: Verify source Vaultwarden container is healthy
log "Step 3: Verifying source Vaultwarden container health"
# Quote the pattern so the remote shell cannot glob-expand it.
if ssh "$SOURCE_HOST" "docker ps | grep -q 'vaultwarden.*healthy'"; then
    increment_passed "Vaultwarden container is healthy"
else
    increment_warning "Vaultwarden container is not showing as healthy (may still be functional)"
fi
# Step 4: Verify source data directory exists and has content
log "Step 4: Verifying source data directory"
if ssh "$SOURCE_HOST" "[ -d '$SOURCE_PATH' ]"; then
    increment_passed "Source data directory exists"

    # Check for critical files
    if ssh "$SOURCE_HOST" "[ -f '$SOURCE_PATH/db.sqlite3' ]"; then
        increment_passed "SQLite database file exists"
    else
        increment_failed "SQLite database file not found"
        exit 1
    fi

    if ssh "$SOURCE_HOST" "[ -f '$SOURCE_PATH/rsa_key.pem' ]"; then
        increment_passed "RSA key file exists"
    else
        increment_failed "RSA key file not found"
        exit 1
    fi

    # Check directory contents
    FILE_COUNT=$(ssh "$SOURCE_HOST" "find '$SOURCE_PATH' -type f | wc -l")
    log "Source directory contains $FILE_COUNT files"
    if [ "$FILE_COUNT" -gt 5 ]; then
        increment_passed "Source directory has sufficient content"
    else
        increment_warning "Source directory seems to have few files ($FILE_COUNT)"
    fi
else
    increment_failed "Source data directory does not exist"
    exit 1
fi
# Step 5: Create comprehensive backup with verification
log "Step 5: Creating comprehensive backup with verification"
BACKUP_FILE="$BACKUP_DIR/vaultwarden_pre_migration_backup_$(date +%Y%m%d_%H%M%S).tar.gz"

# Get container ID (recorded in the log; not used further below)
CONTAINER_ID=$(ssh "$SOURCE_HOST" "docker ps | grep vaultwarden | awk '{print \$1}'")
log "Found Vaultwarden container: $CONTAINER_ID"

# Create backup (quote the remote path in case it ever contains spaces)
log "Creating backup archive"
ssh "$SOURCE_HOST" "tar czf - -C '$SOURCE_PATH' ." > "$BACKUP_FILE"

# Verify backup was created
if [ -f "$BACKUP_FILE" ]; then
    increment_passed "Backup file created successfully"
else
    increment_failed "Failed to create backup file"
    exit 1
fi

# Verify backup size
BACKUP_SIZE=$(stat -c%s "$BACKUP_FILE")
log "Backup size: ${BACKUP_SIZE} bytes"
if [ "$BACKUP_SIZE" -gt 1000000 ]; then
    increment_passed "Backup size is reasonable (${BACKUP_SIZE} bytes)"
else
    increment_warning "Backup seems small (${BACKUP_SIZE} bytes)"
fi

# Verify backup contents
log "Verifying backup contents"
BACKUP_CONTENTS=$(tar tzf "$BACKUP_FILE" | wc -l)
log "Backup contains $BACKUP_CONTENTS files"
if [ "$BACKUP_CONTENTS" -gt 5 ]; then
    increment_passed "Backup contains expected number of files"
else
    increment_warning "Backup contains fewer files than expected"
fi

# Check for critical files in backup
if tar tzf "$BACKUP_FILE" | grep -q "db.sqlite3"; then
    increment_passed "SQLite database included in backup"
else
    increment_failed "SQLite database not found in backup"
    exit 1
fi

if tar tzf "$BACKUP_FILE" | grep -q "rsa_key.pem"; then
    increment_passed "RSA key included in backup"
else
    increment_failed "RSA key not found in backup"
    exit 1
fi
# Step 6: Create secondary backup to different location
log "Step 6: Creating secondary backup"
SECONDARY_BACKUP="/tmp/vaultwarden_emergency_backup_$(date +%Y%m%d_%H%M%S).tar.gz"
cp "$BACKUP_FILE" "$SECONDARY_BACKUP"
if [ -f "$SECONDARY_BACKUP" ]; then
    increment_passed "Secondary backup created at $SECONDARY_BACKUP"
else
    increment_failed "Failed to create secondary backup"
    exit 1
fi

# Step 7: Verify NFS export accessibility
log "Step 7: Verifying NFS export accessibility"
if [ ! -d "$TARGET_PATH" ]; then
    increment_failed "Target NFS path $TARGET_PATH does not exist"
    log "Please ensure the NFS export is properly configured on OMV800"
    exit 1
else
    increment_passed "Target NFS path exists"
fi

# Test write access
if touch "$TARGET_PATH/test_write_access" 2>/dev/null; then
    increment_passed "Write access to target NFS path verified"
    rm -f "$TARGET_PATH/test_write_access"
else
    increment_failed "Cannot write to target NFS path $TARGET_PATH"
    exit 1
fi
# Step 8: Verify Docker Swarm prerequisites
log "Step 8: Verifying Docker Swarm prerequisites"

# Check if we're on a swarm manager
if docker node ls >/dev/null 2>&1; then
    increment_passed "Docker Swarm manager access verified"
else
    increment_failed "Not on a Docker Swarm manager node"
    exit 1
fi

# Check for required secrets
if docker secret ls | grep -q smtp_user; then
    increment_passed "SMTP user secret exists"
else
    increment_warning "SMTP user secret not found (will be created if needed)"
fi

if docker secret ls | grep -q smtp_pass; then
    increment_passed "SMTP password secret exists"
else
    increment_warning "SMTP password secret not found (will be created if needed)"
fi

# Step 9: Verify network connectivity
log "Step 9: Verifying network connectivity"

# Check if caddy-public network exists
if docker network ls | grep -q caddy-public; then
    increment_passed "caddy-public network exists"
else
    increment_failed "caddy-public network not found"
    exit 1
fi

# Step 10: Verify stack file syntax
log "Step 10: Verifying stack file syntax"
if docker-compose -f stacks/apps/vaultwarden.yml config >/dev/null 2>&1; then
    increment_passed "Vaultwarden stack file syntax is valid"
else
    increment_failed "Vaultwarden stack file has syntax errors"
    exit 1
fi
# Step 11: Check disk space
log "Step 11: Checking disk space"

# Check backup directory space (df column 4 = available 1K blocks, so ~1 GB threshold)
BACKUP_DIR_SPACE=$(df "$BACKUP_DIR" | tail -1 | awk '{print $4}')
if [ "$BACKUP_DIR_SPACE" -gt 1000000 ]; then
    increment_passed "Sufficient space in backup directory"
else
    increment_warning "Low space in backup directory"
fi

# Check target NFS space
TARGET_SPACE=$(df "$TARGET_PATH" | tail -1 | awk '{print $4}')
if [ "$TARGET_SPACE" -gt 1000000 ]; then
    increment_passed "Sufficient space in target NFS location"
else
    increment_warning "Low space in target NFS location"
fi
# Step 12: Final validation summary
log "Step 12: Final validation summary"
log ""
log "=== VALIDATION RESULTS ==="
log "Passed: $PASSED"
log "Failed: $FAILED"
log "Warnings: $WARNINGS"
log ""

if [ "$FAILED" -eq 0 ]; then
    log_success "All critical validations passed!"
    log ""
    log "=== BACKUP INFORMATION ==="
    log "Primary backup: $BACKUP_FILE"
    log "Secondary backup: $SECONDARY_BACKUP"
    log "Backup size: ${BACKUP_SIZE} bytes"
    log "Files backed up: $BACKUP_CONTENTS"
    log ""
    log "=== MIGRATION READY ==="
    log "✅ Source Vaultwarden is healthy and accessible"
    log "✅ Complete backup created and verified"
    log "✅ Target NFS location is accessible"
    log "✅ Docker Swarm prerequisites are met"
    log "✅ Stack file syntax is valid"
    log ""
    log "You can now proceed with the migration using:"
    log "sudo ./scripts/migrate_vaultwarden_sqlite.sh"
    log ""
    log_success "Pre-validation completed successfully!"
else
    log_error "Validation failed with $FAILED critical errors"
    log "Please fix the issues above before proceeding with migration"
    exit 1
fi