# Database Migration Plan: SQLite → PostgreSQL

## Overview

This document outlines the migration strategy for moving from SQLite (development) to PostgreSQL (production) for both the Backend API and the CMS.

## Current State

- **Backend**: Uses SQLite with TypeORM entities
- **CMS**: Uses SQLite with Strapi's internal database
- **Development**: Both use local SQLite files
- **Production**: Needs PostgreSQL for scalability and reliability

## Migration Steps

### Phase 1: Database Schema Preparation

#### 1.1 Update Backend Database Configuration

```typescript
// backend/src/config/database.config.ts
export default () => ({
  database: {
    type: process.env.DATABASE_TYPE || 'sqlite',
    host: process.env.DATABASE_HOST || 'localhost',
    port: parseInt(process.env.DATABASE_PORT, 10) || 5432,
    username: process.env.DATABASE_USERNAME || 'placebo_user',
    password: process.env.DATABASE_PASSWORD || 'placebo_password',
    database: process.env.DATABASE_NAME || 'placebo_backend_db',
    // Never auto-sync the schema in production; rely on migrations instead.
    synchronize: process.env.NODE_ENV !== 'production',
    logging: process.env.NODE_ENV !== 'production',
    entities: [__dirname + '/../**/*.entity{.ts,.js}'],
    migrations: [__dirname + '/../migrations/*{.ts,.js}'],
    cli: {
      migrationsDir: 'src/migrations',
    },
  },
});
```

#### 1.2 Create TypeORM Migrations

```bash
cd backend

# Generate migration for existing schema
# (the "--" is required so npm forwards the flag to the underlying script)
npm run typeorm:generate-migration -- --name=InitialSchema

# Create migration files for PostgreSQL compatibility
npm run typeorm:create-migration -- --name=AddPostgresSupport
```

#### 1.3 Update CMS Database Configuration

```javascript
// cms/cms/config/database.js
module.exports = ({ env }) => ({
  connection: {
    client: env('DATABASE_CLIENT', 'sqlite'),
    connection: {
      host: env('DATABASE_HOST', 'localhost'),
      port: env.int('DATABASE_PORT', 5432),
      database: env('DATABASE_NAME', 'placebo_cms_db'),
      user: env('DATABASE_USERNAME', 'placebo_user'),
      password: env('DATABASE_PASSWORD', 'placebo_password'),
      ssl: env.bool('DATABASE_SSL', false),
    },
    debug: false,
  },
});
```

### Phase 2: Data Migration

#### 2.1 Export SQLite Data

```bash
# Export Backend SQLite data
sqlite3 backend/data.db .dump > backend-data.sql

# Export CMS SQLite data
sqlite3 cms/cms/.tmp/data.db .dump > cms-data.sql
```

#### 2.2 Transform SQL for PostgreSQL

Create transformation scripts:

```python
# scripts/transform-sqlite-to-postgres.py
import re

def transform_sqlite_to_postgres(sqlite_sql):
    # Map SQLite-specific syntax to PostgreSQL equivalents.
    # Order matters: rewrite the combined "INTEGER PRIMARY KEY AUTOINCREMENT"
    # form first, otherwise the bare replacements produce invalid DDL.
    sqlite_sql = re.sub(r'INTEGER PRIMARY KEY AUTOINCREMENT', 'SERIAL PRIMARY KEY', sqlite_sql)
    sqlite_sql = re.sub(r'INTEGER PRIMARY KEY', 'SERIAL PRIMARY KEY', sqlite_sql)
    sqlite_sql = re.sub(r'\bBLOB\b', 'BYTEA', sqlite_sql)
    sqlite_sql = re.sub(r'\bDATETIME\b', 'TIMESTAMP', sqlite_sql)
    return sqlite_sql
```
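
A quick sanity check of the SQLite-to-PostgreSQL rewrite rules (inlined here so the snippet runs standalone; note that the combined `INTEGER PRIMARY KEY AUTOINCREMENT` form must be rewritten before the bare keywords):

```python
import re

def transform_sqlite_to_postgres(sqlite_sql):
    # Rewrite the combined form first, then the bare keywords.
    sqlite_sql = re.sub(r'INTEGER PRIMARY KEY AUTOINCREMENT', 'SERIAL PRIMARY KEY', sqlite_sql)
    sqlite_sql = re.sub(r'INTEGER PRIMARY KEY', 'SERIAL PRIMARY KEY', sqlite_sql)
    sqlite_sql = re.sub(r'\bBLOB\b', 'BYTEA', sqlite_sql)
    sqlite_sql = re.sub(r'\bDATETIME\b', 'TIMESTAMP', sqlite_sql)
    return sqlite_sql

ddl = "CREATE TABLE articles (id INTEGER PRIMARY KEY AUTOINCREMENT, body BLOB, created_at DATETIME);"
print(transform_sqlite_to_postgres(ddl))
# CREATE TABLE articles (id SERIAL PRIMARY KEY, body BYTEA, created_at TIMESTAMP);
```

Real dumps contain more than DDL (`PRAGMA` lines, transaction wrappers), so spot-check the transformed output before importing.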
#### 2.3 Import to PostgreSQL

```bash
# Create databases
psql -U placebo_user -h localhost -d placebo_db -c "CREATE DATABASE placebo_backend_db;"
psql -U placebo_user -h localhost -d placebo_db -c "CREATE DATABASE placebo_cms_db;"

# Import transformed data
psql -U placebo_user -h localhost -d placebo_backend_db -f transformed-backend-data.sql
psql -U placebo_user -h localhost -d placebo_cms_db -f transformed-cms-data.sql
```
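
One follow-up the import does not cover: after bulk-loading rows with explicit `id` values, the `SERIAL` sequences still point at their initial value, so the next application insert can collide with a migrated key. A small generator for the required `setval` statements (the table names below are illustrative, not the full list):

```python
def setval_statement(table, pk='id'):
    # pg_get_serial_sequence resolves the sequence backing a SERIAL column;
    # setval resumes it after the highest key imported into the table.
    return (
        f"SELECT setval(pg_get_serial_sequence('{table}', '{pk}'), "
        f"COALESCE(MAX({pk}), 1)) FROM {table};"
    )

# Illustrative table names; run this for every migrated table.
for table in ['articles', 'comments']:
    print(setval_statement(table))
```

The printed statements can be piped into `psql -f -` against each imported database.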
### Phase 3: Application Updates

#### 3.1 Update Environment Variables

Create `.env.production` files:

```bash
# backend/.env.production
DATABASE_TYPE=postgres
DATABASE_HOST=postgres
DATABASE_PORT=5432
DATABASE_USERNAME=placebo_user
DATABASE_PASSWORD=${POSTGRES_PASSWORD}
DATABASE_NAME=placebo_backend_db
DATABASE_SYNCHRONIZE=false

# cms/cms/.env.production
DATABASE_CLIENT=postgres
DATABASE_HOST=postgres
DATABASE_PORT=5432
DATABASE_NAME=placebo_cms_db
DATABASE_USERNAME=placebo_user
DATABASE_PASSWORD=${POSTGRES_PASSWORD}
DATABASE_SSL=false
```

Note: whether `${POSTGRES_PASSWORD}` is expanded depends on how these files are loaded; plain dotenv parsers treat the value literally, so verify that the deployment tooling actually performs the substitution.
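
Missing or empty variables in these files tend to surface only at boot time. A pre-flight check (a hypothetical helper, not part of either app) can fail fast before deployment; in the deploy script, pass the real environment (`os.environ`) instead of the sample dict:

```python
# Variables both services need in production; extend as the configs grow.
REQUIRED_VARS = [
    'DATABASE_HOST', 'DATABASE_PORT', 'DATABASE_NAME',
    'DATABASE_USERNAME', 'DATABASE_PASSWORD',
]

def missing_vars(env):
    # Return the names that are unset or empty in the given mapping.
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example against a deliberately partial environment.
partial = {'DATABASE_HOST': 'postgres', 'DATABASE_PORT': '5432'}
print(missing_vars(partial))  # ['DATABASE_NAME', 'DATABASE_USERNAME', 'DATABASE_PASSWORD']
```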
#### 3.2 Update Docker Configuration

Update `docker-compose.yml` to use PostgreSQL for both services.

### Phase 4: Testing

#### 4.1 Local Testing with Docker Compose

```bash
# Start services with PostgreSQL
docker-compose up -d postgres backend cms frontend

# Run data migration
./scripts/migrate-data.sh

# Test endpoints
curl http://localhost:3000/health
curl http://localhost:1337/_health
```

#### 4.2 Data Validation

```sql
-- Verify data counts match
SELECT 'Backend Articles', COUNT(*) FROM articles
UNION ALL
SELECT 'CMS Content Types', COUNT(*) FROM strapi_content_types;

-- Verify relationships
SELECT a.title, COUNT(c.id) AS comment_count
FROM articles a
LEFT JOIN comments c ON a.id = c.article_id
GROUP BY a.id;
```

### Phase 5: Production Deployment

#### 5.1 Create Migration Script

```bash
#!/bin/bash
# scripts/deploy-migration.sh

set -e

echo "Starting database migration..."

# Backup existing PostgreSQL data
pg_dump -U placebo_user -h ${PRODUCTION_DB_HOST} -d placebo_backend_db > backup-$(date +%Y%m%d).sql

# Run migrations
npm run typeorm:migration:run

# Verify migration (TypeORM records applied migrations in the "migrations"
# table by default, with "timestamp" and "name" columns)
npm run typeorm:query -- "SELECT name FROM migrations ORDER BY timestamp DESC LIMIT 1;"

echo "Migration completed successfully!"
```

#### 5.2 Rollback Plan

```bash
#!/bin/bash
# scripts/rollback-migration.sh

set -e

echo "Starting rollback..."

# Drop and recreate the database so the restore starts from a clean slate;
# replaying a plain-format dump over existing objects fails with conflicts.
dropdb -U placebo_user -h ${PRODUCTION_DB_HOST} placebo_backend_db
createdb -U placebo_user -h ${PRODUCTION_DB_HOST} placebo_backend_db

# Restore from backup
psql -U placebo_user -h ${PRODUCTION_DB_HOST} -d placebo_backend_db -f backup-${BACKUP_DATE}.sql

# Revert environment variables
export DATABASE_TYPE=sqlite
export DATABASE_NAME=./data.db

echo "Rollback completed!"
```

## Migration Tools

### 1. SQLite to PostgreSQL Converter

```python
# scripts/sqlite_to_postgres.py
import sqlite3
import psycopg2

def migrate_table(sqlite_conn, pg_conn, table_name):
    # Read all rows from SQLite
    cursor_sqlite = sqlite_conn.cursor()
    cursor_sqlite.execute(f"SELECT * FROM {table_name}")
    rows = cursor_sqlite.fetchall()

    # Get column names
    cursor_sqlite.execute(f"PRAGMA table_info({table_name})")
    columns = [col[1] for col in cursor_sqlite.fetchall()]

    # Insert into PostgreSQL as a single batch rather than row by row
    cursor_pg = pg_conn.cursor()
    placeholders = ', '.join(['%s'] * len(columns))
    columns_str = ', '.join(columns)
    cursor_pg.executemany(
        f"INSERT INTO {table_name} ({columns_str}) VALUES ({placeholders})",
        rows
    )
    pg_conn.commit()
```
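
The converter's read-then-insert pattern can be exercised end-to-end without a PostgreSQL instance by pointing both ends at in-memory SQLite databases; the Postgres side differs only in its driver and its `%s` placeholder style, which the sketch below parameterizes:

```python
import sqlite3

def copy_table(src, dst, table, placeholder='?'):
    # Read every row; take column names from the cursor description.
    cur = src.cursor()
    cur.execute(f"SELECT * FROM {table}")
    rows = cur.fetchall()
    columns = [d[0] for d in cur.description]

    # Batch-insert into the destination. With psycopg2, pass placeholder='%s'.
    slots = ', '.join([placeholder] * len(columns))
    dst.executemany(f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({slots})", rows)
    dst.commit()

src = sqlite3.connect(':memory:')
dst = sqlite3.connect(':memory:')
src.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
src.executemany("INSERT INTO articles VALUES (?, ?)", [(1, 'first'), (2, 'second')])
dst.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
copy_table(src, dst, 'articles')
print(dst.execute("SELECT COUNT(*) FROM articles").fetchone()[0])  # 2
```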
### 2. Data Validation Script

```python
# scripts/validate_migration.py
def validate_counts(sqlite_conn, pg_conn, table_name):
    # Compare row counts between the source and target databases
    cursor_sqlite = sqlite_conn.cursor()
    cursor_sqlite.execute(f"SELECT COUNT(*) FROM {table_name}")
    sqlite_count = cursor_sqlite.fetchone()[0]

    cursor_pg = pg_conn.cursor()
    cursor_pg.execute(f"SELECT COUNT(*) FROM {table_name}")
    pg_count = cursor_pg.fetchone()[0]

    return sqlite_count == pg_count
```
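
Because the validator only touches DB-API cursors, it can be smoke-tested with two SQLite connections before being pointed at the real SQLite/PostgreSQL pair (function inlined so the snippet runs standalone):

```python
import sqlite3

def validate_counts(sqlite_conn, pg_conn, table_name):
    # Compare COUNT(*) on both sides through plain DB-API cursors.
    cur_a = sqlite_conn.cursor()
    cur_a.execute(f"SELECT COUNT(*) FROM {table_name}")
    count_a = cur_a.fetchone()[0]

    cur_b = pg_conn.cursor()
    cur_b.execute(f"SELECT COUNT(*) FROM {table_name}")
    count_b = cur_b.fetchone()[0]
    return count_a == count_b

a = sqlite3.connect(':memory:')
b = sqlite3.connect(':memory:')
for conn in (a, b):
    conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY)")
    conn.execute("INSERT INTO articles VALUES (1)")
print(validate_counts(a, b, 'articles'))  # True
b.execute("INSERT INTO articles VALUES (2)")
print(validate_counts(a, b, 'articles'))  # False
```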
## Timeline

### Week 1: Preparation
- Update database configurations
- Create migration scripts
- Set up PostgreSQL locally

### Week 2: Testing
- Test the migration locally
- Validate data integrity
- Run performance tests

### Week 3: Staging Deployment
- Deploy to the staging environment
- Run user acceptance testing
- Fix any issues

### Week 4: Production Deployment
- Schedule a maintenance window
- Execute the migration
- Monitor performance
- Roll back if needed

## Risk Mitigation

### High Risks
1. **Data loss**: Maintain multiple backups
2. **Downtime**: Schedule during low-traffic hours
3. **Performance issues**: Monitor closely after migration

### Mitigation Strategies
1. Complete backup before migration
2. Gradual rollout with canary deployment
3. Performance monitoring for 48 hours post-migration

## Success Criteria

1. All data migrated without loss
2. Application functionality unchanged
3. Performance equal to or better than SQLite
4. Downtime kept within the scheduled maintenance window
5. All tests passing post-migration

## Post-Migration Tasks

1. Update documentation
2. Remove SQLite dependencies
3. Set up PostgreSQL monitoring
4. Schedule regular backups
5. Update deployment scripts