# Quick Start: Fix Job Processing Now

**Status:** ✅ Code implemented - needs `DATABASE_URL` configuration

---

## 🚀 Quick Fix (5 minutes)

### Step 1: Get PostgreSQL Connection String

1. Go to the **Supabase Dashboard**: https://supabase.com/dashboard
2. Select your project
3. Navigate to **Settings → Database**
4. Scroll to the **Connection string** section
5. Click the **"URI"** tab
6. Copy the connection string (it looks like this):

   ```
   postgresql://postgres.[PROJECT-REF]:[PASSWORD]@aws-0-us-central-1.pooler.supabase.com:6543/postgres
   ```

### Step 2: Add to Environment

**For Local Testing:**

```bash
cd backend
echo 'DATABASE_URL=postgresql://postgres.[PROJECT-REF]:[PASSWORD]@aws-0-us-central-1.pooler.supabase.com:6543/postgres' >> .env
```

**For Firebase Functions (Production):**

```bash
# For secrets (recommended for sensitive data):
firebase functions:secrets:set DATABASE_URL

# Or set it as an environment variable in firebase.json or the function configuration.
# See: https://firebase.google.com/docs/functions/config-env
```

### Step 3: Test Connection

```bash
cd backend
npm run test:postgres
```

**Expected Output:**

```
✅ PostgreSQL pool created
✅ Connection successful!
✅ processing_jobs table exists
✅ documents table exists
🎯 Ready to create jobs via direct PostgreSQL connection
```

### Step 4: Test Job Creation

```bash
# Get a document ID first
npm run test:postgres

# Then create a job for a document
npm run test:job
```

### Step 5: Build and Deploy

```bash
cd backend
npm run build
firebase deploy --only functions
```

---

## ✅ What This Fixes

**Before:**

- ❌ Jobs fail to create (PostgREST cache error)
- ❌ Documents stuck in `processing_llm`
- ❌ No processing happens

**After:**

- ✅ Jobs created via direct PostgreSQL
- ✅ Bypasses PostgREST cache issues
- ✅ Jobs processed by the scheduled function
- ✅ Documents complete successfully

---

## 🔍 Verification

After deployment, test with a real upload:

1. **Upload a document** via the frontend
2. **Check logs:**

   ```bash
   firebase functions:log --only api --limit 50
   ```

   Look for: `"Processing job created via direct PostgreSQL"`

3. **Check database:**

   ```sql
   SELECT * FROM processing_jobs
   WHERE status = 'pending'
   ORDER BY created_at DESC
   LIMIT 5;
   ```

4. **Wait 1-2 minutes** for the scheduled function to process the job
5. **Check document:**

   ```sql
   SELECT id, status, analysis_data
   FROM documents
   WHERE id = '[DOCUMENT-ID]';
   ```

   Should show `status = 'completed'` and `analysis_data` populated.

---

## 🐛 Troubleshooting

### Error: "DATABASE_URL environment variable is required"

**Solution:** Make sure you added `DATABASE_URL` to `.env` (local) or configured it as a Firebase secret (production).

### Error: "Connection timeout"

**Solution:**

- Verify that the connection string is correct
- Check that your IP is allowed in Supabase (Settings → Database → Connection pooling)
- Try transaction mode instead of session mode

### Error: "Authentication failed"

**Solution:**

- Verify the password in the connection string
- Reset the database password in Supabase if needed
- Make sure you're using the pooler connection string (port 6543)

### Still Getting Cache Errors?

**Solution:** The fallback to the Supabase client will still work, but the direct PostgreSQL path should succeed first. Check the logs to see which method was used.

---

## 📊 Expected Flow After Fix

```
1. User Uploads PDF                    ✅
2. GCS Upload                          ✅
3. Confirm Upload                      ✅
4. Job Created via Direct PostgreSQL   ✅ (NEW!)
5. Scheduled Function Finds Job        ✅
6. Job Processor Executes              ✅
7. Document Updated to Completed       ✅
```

---

## 🎯 Success Criteria

You'll know it's working when:

- ✅ The `test:postgres` script succeeds
- ✅ The `test:job` script creates a job
- ✅ An upload creates a job automatically
- ✅ The scheduled function's logs show jobs being processed
- ✅ Documents transition from `processing_llm` → `completed`
- ✅ `analysis_data` is populated

---

## 📝 Next Steps

1. ✅ Code implemented
2. ⏳ Get `DATABASE_URL` from Supabase
3. ⏳ Add to environment
4. ⏳ Test connection
5. ⏳ Test job creation
6. ⏳ Deploy to Firebase
7. ⏳ Verify end-to-end

**Once `DATABASE_URL` is configured, the system will work end-to-end!**
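As an extra guard before deploying, you can sanity-check the `DATABASE_URL` string itself. The helper below is a hypothetical sketch (not part of the codebase) using Node's built-in WHATWG `URL` parser; it catches the two mistakes the troubleshooting section calls out most: using the direct port instead of the pooler port 6543, and leaving the `[PASSWORD]` placeholder in place.

```typescript
// Hypothetical sanity check for DATABASE_URL -- a sketch, not the real code.
function checkDatabaseUrl(raw: string): string[] {
  const problems: string[] = [];
  let url: URL;
  try {
    url = new URL(raw);
  } catch {
    return ["not a parseable URL"];
  }
  if (url.protocol !== "postgresql:" && url.protocol !== "postgres:") {
    problems.push(`unexpected scheme ${url.protocol}`);
  }
  if (url.port !== "6543") {
    // The troubleshooting section above recommends the pooler string (port 6543).
    problems.push(`port ${url.port || "(none)"} is not the pooler port 6543`);
  }
  // The URL parser percent-encodes "[" and "]", so decode before checking.
  if (!url.password || decodeURIComponent(url.password).includes("[")) {
    problems.push("password looks missing or still a [PASSWORD] placeholder");
  }
  return problems; // empty array => the string looks usable
}

console.log(checkDatabaseUrl(
  "postgresql://postgres.abc123:s3cret@aws-0-us-central-1.pooler.supabase.com:6543/postgres"
)); // []
```

Running it on a direct (port 5432) connection string or on the unedited template from Step 1 returns a non-empty list of problems instead.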
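The "direct PostgreSQL first, Supabase client as fallback" behavior mentioned in Troubleshooting can be sketched as a small wrapper. This is an illustration under assumptions, not the actual implementation: the function names, the `CreateJob` signature, and the log wording are invented for the example, and the real insert/fallback calls (e.g. a `pg` Pool `INSERT` and a Supabase client insert) are represented by stand-ins.

```typescript
// Sketch of the fallback strategy: try the direct PostgreSQL path (which
// bypasses the PostgREST schema cache), and only fall back to the Supabase
// client if it throws. All names here are illustrative assumptions.
type CreateJob = (documentId: string) => Promise<string>;

async function createJobWithFallback(
  direct: CreateJob,      // e.g. a pg Pool INSERT into processing_jobs
  viaSupabase: CreateJob, // e.g. a Supabase client insert (may hit the cache error)
  documentId: string
): Promise<{ jobId: string; method: "direct-postgres" | "supabase-client" }> {
  try {
    // Preferred path: direct connection, no PostgREST cache involved.
    return { jobId: await direct(documentId), method: "direct-postgres" };
  } catch (err) {
    // Fallback keeps uploads working even if the direct path fails.
    console.warn("Direct PostgreSQL insert failed, falling back:", err);
    return { jobId: await viaSupabase(documentId), method: "supabase-client" };
  }
}

// Demo with stand-in implementations (no real database needed):
const failingDirect: CreateJob = async () => { throw new Error("cache error"); };
const workingSupabase: CreateJob = async (id) => `job-for-${id}`;

createJobWithFallback(failingDirect, workingSupabase, "doc-1").then((r) =>
  console.log(r.method, r.jobId) // prints: supabase-client job-for-doc-1
);
```

Logging which branch ran is what makes the "check logs to see which method was used" step in Troubleshooting possible.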