PHASE 7 COMPLETED: Testing and Launch - PROJECT FINISHED

Implemented 4 modules with an agent swarm:

1. FUNCTIONAL TESTING (Jest)
   - Jest + ts-jest configuration
   - Unit tests: auth, booking, court (55 tests)
   - Integration tests: routes (56 tests)
   - Test factories and utilities
   - Coverage configured (70% for services)
   - Scripts: test, test:watch, test:coverage
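The 70% service coverage gate described above maps to Jest's `coverageThreshold` option. The actual config file is not part of this excerpt; one plausible shape (hypothetical — path and values assumed), if the configuration lives under the `jest` key of `package.json`, would be:

```json
{
  "jest": {
    "preset": "ts-jest",
    "testEnvironment": "node",
    "coverageThreshold": {
      "./src/services/": {
        "lines": 70,
        "functions": 70,
        "branches": 70,
        "statements": 70
      }
    }
  }
}
```

With a per-path threshold like this, `npm run test:coverage` fails the build when service coverage drops below 70%.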

2. USER TESTING (Beta)
   - Beta tester system
   - Feedback with categories and severity
   - Beta issue tracking
   - 8 sample testers created
   - Full API for feedback management

3. COMPLETE DOCUMENTATION
   - API.md - 150+ documented endpoints
   - SETUP.md - Installation guide
   - DEPLOY.md - Deployment on a VPS
   - ARCHITECTURE.md - System architecture
   - APP_STORE.md - App store material
   - Complete Postman collection
   - PM2 ecosystem config
   - Nginx config with SSL

4. GO LIVE AND PRODUCTION
   - Monitoring system (logs, health checks)
   - Multi-channel alerting service
   - Pre-deploy check script
   - Production Docker + docker-compose
   - Automated backups
   - GitHub Actions CI/CD
   - Complete launch checklist

FINAL STATISTICS:
- Phases completed: 7/7
- Files created: 250+
- Lines of code: 60,000+
- API endpoints: 150+
- Tests: 110+
- Documentation: 5,000+ lines

PROJECT COMPLETE AND READY FOR PRODUCTION
2026-01-31 22:30:44 +00:00
parent e135e7ad24
commit dd10891432
61 changed files with 19256 additions and 142 deletions

backend/scripts/backup.sh Executable file

@@ -0,0 +1,363 @@
#!/bin/bash
# =============================================================================
# Backup script for App Padel
# Phase 7.4 - Go Live and Support
# =============================================================================
#
# This script backs up:
# - The database (PostgreSQL or SQLite)
# - Log files
# - User-uploaded files (uploads)
#
# Backups are compressed and can optionally be uploaded to S3 (AWS, MinIO, etc.)
#
# Usage:
# ./scripts/backup.sh
#
# Crontab (run daily at 2 AM):
# 0 2 * * * /path/to/scripts/backup.sh >> /var/log/padel-backup.log 2>&1
# =============================================================================
set -euo pipefail
# -----------------------------------------------------------------------------
# Configuration
# -----------------------------------------------------------------------------
# Directories
BACKUP_DIR="${BACKUP_DIR:-/backups}"
APP_DIR="${APP_DIR:-/app}"
LOGS_DIR="${APP_DIR}/logs"
UPLOADS_DIR="${APP_DIR}/uploads"
# Database
DB_TYPE="${DB_TYPE:-postgresql}" # postgresql or sqlite
DB_HOST="${DB_HOST:-postgres}"
DB_PORT="${DB_PORT:-5432}"
DB_NAME="${DB_NAME:-padeldb}"
DB_USER="${DB_USER:-padeluser}"
DB_PASSWORD="${DB_PASSWORD:-}"
SQLITE_PATH="${SQLITE_PATH:-/app/prisma/dev.db}"
# Retention (days)
RETENTION_DAYS="${RETENTION_DAYS:-30}"
# Notifications
SLACK_WEBHOOK_URL="${SLACK_WEBHOOK_URL:-}"
EMAIL_TO="${BACKUP_EMAIL_TO:-}"
SMTP_HOST="${SMTP_HOST:-}"
SMTP_PORT="${SMTP_PORT:-587}"
SMTP_USER="${SMTP_USER:-}"
SMTP_PASS="${SMTP_PASS:-}"
# S3 (optional)
S3_BUCKET="${BACKUP_S3_BUCKET:-}"
S3_REGION="${BACKUP_S3_REGION:-us-east-1}"
S3_ENDPOINT="${BACKUP_S3_ENDPOINT:-}" # For MinIO or other S3-compatible services
AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-}"
AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-}"
# -----------------------------------------------------------------------------
# Internal variables
# -----------------------------------------------------------------------------
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
DATE=$(date +"%Y-%m-%d")
BACKUP_NAME="padel_backup_${TIMESTAMP}"
BACKUP_PATH="${BACKUP_DIR}/${BACKUP_NAME}"
LOG_FILE="${BACKUP_DIR}/backup_${TIMESTAMP}.log"
# Counters
ERRORS=0
WARNINGS=0
# -----------------------------------------------------------------------------
# Helper functions
# -----------------------------------------------------------------------------
log() {
local level="$1"
shift
local message="$*"
local timestamp=$(date '+%Y-%m-%d %H:%M:%S')
echo "[${timestamp}] [${level}] ${message}" | tee -a "$LOG_FILE"
}
info() { log "INFO" "$@"; }
# Use VAR=$((VAR + 1)) instead of ((VAR++)): the latter returns status 1 when
# the old value is 0, which would abort the script under set -e.
warn() { log "WARN" "$@"; WARNINGS=$((WARNINGS + 1)); }
error() { log "ERROR" "$@"; ERRORS=$((ERRORS + 1)); }
send_notification() {
local status="$1"
local message="$2"
# Slack
if [[ -n "$SLACK_WEBHOOK_URL" ]]; then
local color="good"
[[ "$status" == "FAILED" ]] && color="danger"
[[ "$status" == "WARNING" ]] && color="warning"
curl -s -X POST "$SLACK_WEBHOOK_URL" \
-H 'Content-type: application/json' \
--data "{
\"attachments\": [{
\"color\": \"${color}\",
\"title\": \"Padel Backup - ${status}\",
\"text\": \"${message}\",
\"footer\": \"Padel App\",
\"ts\": $(date +%s)
}]
}" || warn "Failed to send Slack notification"
fi
# Email (via sendmail or similar)
if [[ -n "$EMAIL_TO" && -n "$SMTP_HOST" ]]; then
local subject="[Padel Backup] ${status} - ${DATE}"
{
echo "Subject: ${subject}"
echo "To: ${EMAIL_TO}"
echo "Content-Type: text/plain; charset=UTF-8"
echo ""
echo "$message"
echo ""
echo "---"
echo "Timestamp: $(date)"
echo "Hostname: $(hostname)"
echo "Backup: ${BACKUP_NAME}"
} | sendmail "$EMAIL_TO" || warn "Failed to send email"
fi
}
cleanup() {
local exit_code=$?
if [[ $exit_code -ne 0 ]]; then
error "Script exited with errors (code: $exit_code)"
send_notification "FAILED" "Backup failed. See log: ${LOG_FILE}"
# Clean up temporary files
if [[ -d "$BACKUP_PATH" ]]; then
rm -rf "$BACKUP_PATH"
fi
fi
exit $exit_code
}
trap cleanup EXIT
# -----------------------------------------------------------------------------
# Preparation
# -----------------------------------------------------------------------------
info "Starting backup: ${BACKUP_NAME}"
info "Backup directory: ${BACKUP_DIR}"
# Create backup directories
mkdir -p "$BACKUP_DIR"
mkdir -p "$BACKUP_PATH"
# -----------------------------------------------------------------------------
# Database backup
# -----------------------------------------------------------------------------
info "Backing up database (${DB_TYPE})..."
if [[ "$DB_TYPE" == "postgresql" ]]; then
if command -v pg_dump &> /dev/null; then
export PGPASSWORD="$DB_PASSWORD"
if pg_dump -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" \
--verbose --no-owner --no-acl \
-f "${BACKUP_PATH}/database.sql" 2>> "$LOG_FILE"; then
info "PostgreSQL backup completed: database.sql"
else
error "PostgreSQL backup failed"
fi
unset PGPASSWORD
else
error "pg_dump not found"
fi
elif [[ "$DB_TYPE" == "sqlite" ]]; then
if [[ -f "$SQLITE_PATH" ]]; then
# SQLite: use the .backup command so the copy is consistent
if sqlite3 "$SQLITE_PATH" ".backup '${BACKUP_PATH}/database.db'" 2>> "$LOG_FILE"; then
info "SQLite backup completed: database.db"
else
error "SQLite backup failed"
fi
else
error "SQLite file not found: $SQLITE_PATH"
fi
else
error "Unsupported database type: $DB_TYPE"
fi
# -----------------------------------------------------------------------------
# Logs backup
# -----------------------------------------------------------------------------
info "Backing up logs..."
if [[ -d "$LOGS_DIR" ]]; then
if tar -czf "${BACKUP_PATH}/logs.tar.gz" -C "$(dirname "$LOGS_DIR")" "$(basename "$LOGS_DIR")" 2>> "$LOG_FILE"; then
info "Logs backup completed: logs.tar.gz"
else
warn "Failed to compress logs (they may not exist)"
fi
else
warn "Logs directory not found: $LOGS_DIR"
fi
# -----------------------------------------------------------------------------
# Uploads backup
# -----------------------------------------------------------------------------
info "Backing up uploads..."
if [[ -d "$UPLOADS_DIR" ]]; then
if tar -czf "${BACKUP_PATH}/uploads.tar.gz" -C "$(dirname "$UPLOADS_DIR")" "$(basename "$UPLOADS_DIR")" 2>> "$LOG_FILE"; then
info "Uploads backup completed: uploads.tar.gz"
else
warn "Failed to compress uploads"
fi
else
warn "Uploads directory not found: $UPLOADS_DIR"
fi
# -----------------------------------------------------------------------------
# Create manifest
# -----------------------------------------------------------------------------
cat > "${BACKUP_PATH}/manifest.json" << EOF
{
"backup_name": "${BACKUP_NAME}",
"timestamp": "$(date -Iseconds)",
"hostname": "$(hostname)",
"version": "1.0.0",
"database": {
"type": "${DB_TYPE}",
"name": "${DB_NAME}"
},
"files": [
$(ls -1 "${BACKUP_PATH}" | grep -E '\.(sql|db|tar\.gz)$' | sed 's/^/ "/;s/$/"/' | paste -sd ',' -)
]
}
EOF
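# Example of the manifest generated above (illustrative values only — actual
# hostname, timestamp, and file entries depend on DB_TYPE and which
# directories existed at run time):
# {
#   "backup_name": "padel_backup_20260131_020000",
#   "timestamp": "2026-01-31T02:00:00+00:00",
#   "hostname": "padel-vps",
#   "version": "1.0.0",
#   "database": { "type": "postgresql", "name": "padeldb" },
#   "files": ["database.sql","logs.tar.gz","uploads.tar.gz"]
# }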
info "Manifest created"
# -----------------------------------------------------------------------------
# Compress full backup
# -----------------------------------------------------------------------------
info "Compressing full backup..."
cd "$BACKUP_DIR"
if tar -czf "${BACKUP_NAME}.tar.gz" "$BACKUP_NAME"; then
info "Backup compressed: ${BACKUP_NAME}.tar.gz"
# Compute size
BACKUP_SIZE=$(du -h "${BACKUP_NAME}.tar.gz" | cut -f1)
info "Backup size: ${BACKUP_SIZE}"
# Remove temporary directory
rm -rf "$BACKUP_PATH"
else
error "Failed to compress backup"
exit 1
fi
# -----------------------------------------------------------------------------
# Upload to S3 (optional)
# -----------------------------------------------------------------------------
if [[ -n "$S3_BUCKET" && -n "$AWS_ACCESS_KEY_ID" ]]; then
info "Uploading backup to S3..."
# Point the AWS CLI at a custom endpoint if needed (e.g. MinIO)
if [[ -n "$S3_ENDPOINT" ]]; then
export AWS_ENDPOINT_URL="$S3_ENDPOINT"
fi
if command -v aws &> /dev/null; then
if aws s3 cp "${BACKUP_NAME}.tar.gz" "s3://${S3_BUCKET}/backups/" \
--region "$S3_REGION" 2>> "$LOG_FILE"; then
info "Backup uploaded to S3: s3://${S3_BUCKET}/backups/${BACKUP_NAME}.tar.gz"
else
error "Failed to upload backup to S3"
fi
else
warn "AWS CLI not installed, skipping S3 upload"
fi
fi
# -----------------------------------------------------------------------------
# Clean up old backups
# -----------------------------------------------------------------------------
info "Cleaning up old backups (retention: ${RETENTION_DAYS} days)..."
# Local backups
find "$BACKUP_DIR" -name "padel_backup_*.tar.gz" -type f -mtime +$RETENTION_DAYS -delete 2>/dev/null || true
find "$BACKUP_DIR" -name "backup_*.log" -type f -mtime +$RETENTION_DAYS -delete 2>/dev/null || true
info "Local cleanup completed"
# S3 backups (if configured)
if [[ -n "$S3_BUCKET" && -n "$AWS_ACCESS_KEY_ID" ]] && command -v aws &> /dev/null; then
info "Cleaning up old S3 backups..."
# List backups and delete those older than the retention window
aws s3 ls "s3://${S3_BUCKET}/backups/" --region "$S3_REGION" | \
while read -r line; do
file_date=$(echo "$line" | awk '{print $1}')
file_name=$(echo "$line" | awk '{print $4}')
# Days elapsed since the file's listing date
file_timestamp=$(date -d "$file_date" +%s 2>/dev/null || echo 0)
current_timestamp=$(date +%s)
days_old=$(( (current_timestamp - file_timestamp) / 86400 ))
if [[ $days_old -gt $RETENTION_DAYS ]]; then
aws s3 rm "s3://${S3_BUCKET}/backups/${file_name}" --region "$S3_REGION" 2>/dev/null || true
info "Deleted old S3 backup: $file_name"
fi
done
fi
# -----------------------------------------------------------------------------
# Summary and notification
# -----------------------------------------------------------------------------
info "Backup completed: ${BACKUP_NAME}.tar.gz"
info "Size: ${BACKUP_SIZE}"
info "Errors: ${ERRORS}"
info "Warnings: ${WARNINGS}"
# Build the summary message
SUMMARY="Backup completed successfully.
Name: ${BACKUP_NAME}
Date: ${DATE}
Size: ${BACKUP_SIZE}
Errors: ${ERRORS}
Warnings: ${WARNINGS}
Included files:
- Database (${DB_TYPE})
- Logs
- Uploads
Location: ${BACKUP_DIR}/${BACKUP_NAME}.tar.gz"
if [[ $ERRORS -eq 0 ]]; then
send_notification "SUCCESS" "$SUMMARY"
info "✅ Backup finished successfully"
else
send_notification "WARNING" "Backup completed with ${ERRORS} errors. See log for details."
warn "⚠️ Backup finished with errors"
fi
# Disarm the EXIT trap so a nonzero error count doesn't also fire the FAILED notification
trap - EXIT
exit $ERRORS
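A backup is only useful if it restores. This commit does not ship a verification step; a minimal integrity-check sketch is below, where `verify_backup` is a hypothetical helper (not part of this commit) and the demo archive is created purely for illustration, shaped like this script's output:

```shell
#!/bin/bash
set -euo pipefail

# verify_backup: succeed only if the archive unpacks cleanly and contains
# a manifest plus at least one database dump (hypothetical helper).
verify_backup() {
  local archive="$1" listing
  listing=$(tar -tzf "$archive" 2>/dev/null) || return 1
  grep -q 'manifest.json' <<<"$listing" || return 1
  grep -qE '\.(sql|db)$' <<<"$listing" || return 1
}

# Build a throwaway archive with the same layout the backup script produces.
workdir=$(mktemp -d)
mkdir -p "$workdir/padel_backup_demo"
echo 'SELECT 1;' > "$workdir/padel_backup_demo/database.sql"
echo '{"backup_name":"padel_backup_demo"}' > "$workdir/padel_backup_demo/manifest.json"
tar -czf "$workdir/padel_backup_demo.tar.gz" -C "$workdir" padel_backup_demo

verify_backup "$workdir/padel_backup_demo.tar.gz" && echo "backup OK"
```

Capturing the listing once with `$(tar -tzf …)` avoids `pipefail` tripping on an early-exiting `grep -q`. A check like this could run right after the compression step, or in the crontab entry after the main script.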

backend/scripts/deploy.sh Executable file

@@ -0,0 +1,322 @@
#!/bin/bash
# ============================================
# Deploy script - App Canchas de Pádel
# ============================================
# Usage: ./deploy.sh [environment]
# Example: ./deploy.sh production
# ============================================
set -e
# ============================================
# CONFIGURATION
# ============================================
# Output colors
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
# Default variables
ENVIRONMENT="${1:-production}"
APP_NAME="app-padel-api"
APP_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
PM2_CONFIG="$APP_DIR/ecosystem.config.js"
HEALTH_CHECK_URL="http://localhost:3000/api/v1/health"
MAX_RETRIES=5
RETRY_DELAY=5
# ============================================
# FUNCTIONS
# ============================================
log_info() {
echo -e "${BLUE}[INFO]${NC} $1"
}
log_success() {
echo -e "${GREEN}[OK]${NC} $1"
}
log_warning() {
echo -e "${YELLOW}[WARN]${NC} $1"
}
log_error() {
echo -e "${RED}[ERROR]${NC} $1"
}
print_banner() {
echo ""
echo "========================================"
echo " 🚀 Deploy - App Canchas de Pádel"
echo " Environment: $ENVIRONMENT"
echo " Date: $(date)"
echo "========================================"
echo ""
}
check_prerequisites() {
log_info "Checking prerequisites..."
# Check Node.js
if ! command -v node &> /dev/null; then
log_error "Node.js is not installed"
exit 1
fi
# Check npm
if ! command -v npm &> /dev/null; then
log_error "npm is not installed"
exit 1
fi
# Check PM2
if ! command -v pm2 &> /dev/null; then
log_error "PM2 is not installed. Install with: npm install -g pm2"
exit 1
fi
# Check git
if ! command -v git &> /dev/null; then
log_error "Git is not installed"
exit 1
fi
# Check application directory
if [ ! -d "$APP_DIR" ]; then
log_error "Application directory not found: $APP_DIR"
exit 1
fi
# Check PM2 config file
if [ ! -f "$PM2_CONFIG" ]; then
log_error "PM2 config file not found: $PM2_CONFIG"
exit 1
fi
log_success "Prerequisites verified"
}
backup_current() {
log_info "Creating backup..."
BACKUP_DIR="$APP_DIR/backups"
BACKUP_NAME="backup_$(date +%Y%m%d_%H%M%S).tar.gz"
mkdir -p "$BACKUP_DIR"
# Back up dist and .env
if [ -d "$APP_DIR/dist" ]; then
tar -czf "$BACKUP_DIR/$BACKUP_NAME" -C "$APP_DIR" dist .env 2>/dev/null || true
log_success "Backup created: $BACKUP_DIR/$BACKUP_NAME"
else
log_warning "No previous build to back up"
fi
}
update_code() {
log_info "Updating code from the repository..."
cd "$APP_DIR"
# Stash local changes if any
if [ -n "$(git status --porcelain)" ]; then
log_warning "There are uncommitted local changes"
git stash
fi
# Fetch changes
git fetch origin
# Check out the right branch
if [ "$ENVIRONMENT" = "production" ]; then
git checkout main || git checkout master
else
git checkout develop || git checkout development
fi
git pull origin "$(git branch --show-current)"
log_success "Code updated"
}
install_dependencies() {
log_info "Installing dependencies..."
cd "$APP_DIR"
# npm ci does a clean install from the lockfile, avoiding node_modules conflicts
if [ "$ENVIRONMENT" = "production" ]; then
npm ci --only=production
else
npm ci
fi
log_success "Dependencies installed"
}
build_app() {
log_info "Building application..."
cd "$APP_DIR"
# Clean previous build
rm -rf dist
# Compile TypeScript
npm run build
if [ ! -d "$APP_DIR/dist" ]; then
log_error "Build failed - dist directory not found"
exit 1
fi
log_success "Application built"
}
run_migrations() {
log_info "Running database migrations..."
cd "$APP_DIR"
# Generate Prisma client
npx prisma generate
# Apply migrations
npx prisma migrate deploy
log_success "Migrations completed"
}
restart_app() {
log_info "Restarting application with PM2..."
cd "$APP_DIR"
# Check whether the app is already running
if pm2 list | grep -q "$APP_NAME"; then
log_info "Reloading existing application..."
pm2 reload "$PM2_CONFIG" --env "$ENVIRONMENT"
else
log_info "Starting application..."
pm2 start "$PM2_CONFIG" --env "$ENVIRONMENT"
fi
# Save the PM2 process list
pm2 save
log_success "Application restarted"
}
health_check() {
log_info "Checking application health..."
local retries=0
local is_healthy=false
while [ $retries -lt $MAX_RETRIES ]; do
if curl -sf "$HEALTH_CHECK_URL" | grep -q '"success":true'; then
is_healthy=true
break
fi
retries=$((retries + 1))
log_warning "Attempt $retries/$MAX_RETRIES failed. Retrying in ${RETRY_DELAY}s..."
sleep $RETRY_DELAY
done
if [ "$is_healthy" = true ]; then
log_success "Health check passed - API is up"
return 0
else
log_error "Health check failed after $MAX_RETRIES attempts"
return 1
fi
}
rollback() {
log_warning "Rolling back..."
BACKUP_DIR="$APP_DIR/backups"
# Find the most recent backup
LATEST_BACKUP=$(ls -t "$BACKUP_DIR"/backup_*.tar.gz 2>/dev/null | head -n 1)
if [ -n "$LATEST_BACKUP" ]; then
log_info "Restoring from: $LATEST_BACKUP"
cd "$APP_DIR"
tar -xzf "$LATEST_BACKUP"
pm2 reload "$PM2_CONFIG"
log_success "Rollback completed"
else
log_error "No backup found to restore"
fi
}
cleanup() {
log_info "Cleaning up temporary files..."
# Prune old backups (keep the last 10)
BACKUP_DIR="$APP_DIR/backups"
if [ -d "$BACKUP_DIR" ]; then
ls -t "$BACKUP_DIR"/backup_*.tar.gz 2>/dev/null | tail -n +11 | xargs rm -f 2>/dev/null || true
fi
# Prune old logs (keep the last 7 days)
find "$APP_DIR/logs" -name "*.log" -mtime +7 -delete 2>/dev/null || true
log_success "Cleanup completed"
}
# ============================================
# MAIN
# ============================================
main() {
print_banner
# Validate environment
if [ "$ENVIRONMENT" != "production" ] && [ "$ENVIRONMENT" != "development" ]; then
log_error "Invalid environment. Use: production or development"
exit 1
fi
# Run the deploy steps
check_prerequisites
backup_current
update_code
install_dependencies
build_app
run_migrations
restart_app
# Health check
if health_check; then
log_success "🎉 Deploy completed successfully!"
cleanup
echo ""
echo "========================================"
echo " 📊 Application status:"
echo "========================================"
pm2 status "$APP_NAME"
echo ""
echo " 🔗 URL: $HEALTH_CHECK_URL"
echo " 📜 Logs: pm2 logs $APP_NAME"
echo "========================================"
else
log_error "❌ Deploy failed - rolling back"
rollback
exit 1
fi
}
# Handle errors
trap 'log_error "Error on line $LINENO"' ERR
# Run
main "$@"
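The retry loop inside `health_check` is a reusable pattern. A generic sketch is below — `wait_for` and `flaky_probe` are hypothetical names, not part of this commit — that takes the probe as a command, which makes the loop testable without a running server:

```shell
#!/bin/bash
set -euo pipefail

# wait_for MAX_RETRIES DELAY CMD...: run CMD until it succeeds or the
# retry budget is exhausted (same shape as health_check's loop).
wait_for() {
  local max_retries="$1" delay="$2"
  shift 2
  local attempt=0
  while (( attempt < max_retries )); do
    if "$@"; then
      return 0
    fi
    attempt=$((attempt + 1))
    sleep "$delay"
  done
  return 1
}

# Demo probe that succeeds on its third invocation, using a counter file.
counter_file=$(mktemp)
echo 0 > "$counter_file"
flaky_probe() {
  local n
  n=$(<"$counter_file")
  echo $((n + 1)) > "$counter_file"
  (( n + 1 >= 3 ))
}

wait_for 5 0 flaky_probe && echo "healthy after $(<"$counter_file") attempts"
```

In deploy.sh the equivalent call would be something like `wait_for "$MAX_RETRIES" "$RETRY_DELAY" curl -sf "$HEALTH_CHECK_URL"`.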

backend/scripts/pre-deploy-check.js Executable file

@@ -0,0 +1,541 @@
#!/usr/bin/env node
/**
* Pre-deploy verification script
* Phase 7.4 - Go Live and Support
*
* This script verifies that everything is ready before a production deployment.
*
* Usage:
* node scripts/pre-deploy-check.js
*
* Exit code:
* - 0 if all checks pass
* - 1 if any check fails
*/
const { execSync } = require('child_process');
const fs = require('fs');
const path = require('path');
// Output colors
const colors = {
reset: '\x1b[0m',
red: '\x1b[31m',
green: '\x1b[32m',
yellow: '\x1b[33m',
blue: '\x1b[34m',
cyan: '\x1b[36m',
};
// Results
const results = {
passed: [],
failed: [],
warnings: [],
};
/**
* Prints a message in color
*/
function print(message, color = 'reset') {
console.log(`${colors[color]}${message}${colors.reset}`);
}
/**
* Runs a command and returns its output
*/
function runCommand(command, options = {}) {
try {
return execSync(command, {
encoding: 'utf-8',
stdio: options.silent ? 'pipe' : 'inherit',
...options
});
} catch (error) {
if (options.ignoreError) {
return error.stdout || '';
}
throw error;
}
}
/**
* Checks required environment variables
*/
function checkEnvironmentVariables() {
print('\n🔍 Checking environment variables...', 'cyan');
const required = [
'DATABASE_URL',
'JWT_SECRET',
'NODE_ENV',
];
const recommended = [
'SMTP_HOST',
'SMTP_USER',
'SMTP_PASS',
'MERCADOPAGO_ACCESS_TOKEN',
'FRONTEND_URL',
'API_URL',
];
let allRequiredPresent = true;
// Check required variables
for (const env of required) {
if (!process.env[env]) {
print(`${env}: NOT SET`, 'red');
results.failed.push(`Missing required variable: ${env}`);
allRequiredPresent = false;
} else {
print(`${env}: Set`, 'green');
}
}
// Check recommended variables
for (const env of recommended) {
if (!process.env[env]) {
print(` ⚠️ ${env}: Not set (recommended)`, 'yellow');
results.warnings.push(`Missing recommended variable: ${env}`);
} else {
print(`${env}: Set`, 'green');
}
}
if (allRequiredPresent) {
results.passed.push('Required environment variables');
}
return allRequiredPresent;
}
/**
* Checks database connectivity
*/
async function checkDatabaseConnection() {
print('\n🔍 Checking database connection...', 'cyan');
try {
const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();
// Try to connect
await prisma.$connect();
// Verify we can run queries
await prisma.$queryRaw`SELECT 1`;
// Fetch DB info (sqlite_version() assumes SQLite; use version() on PostgreSQL)
const dbInfo = await prisma.$queryRaw`SELECT sqlite_version() as version`;
await prisma.$disconnect();
print(` ✅ Database connection successful`, 'green');
print(` 📊 Version: ${dbInfo[0]?.version || 'N/A'}`, 'blue');
results.passed.push('Database connection');
return true;
} catch (error) {
print(` ❌ Connection error: ${error.message}`, 'red');
results.failed.push(`Database connection failed: ${error.message}`);
return false;
}
}
/**
* Checks for pending migrations
*/
async function checkPendingMigrations() {
print('\n🔍 Checking for pending migrations...', 'cyan');
try {
// Generate the Prisma client first
runCommand('npx prisma generate', { silent: true });
// Check migration status
const output = runCommand('npx prisma migrate status', { silent: true, ignoreError: true });
if (output.includes('Database schema is up to date') ||
output.includes('No pending migrations')) {
print(` ✅ No pending migrations`, 'green');
results.passed.push('Database migrations');
return true;
} else if (output.includes('pending migration')) {
print(` ⚠️ There are pending migrations`, 'yellow');
print(` Run: npx prisma migrate deploy`, 'yellow');
results.warnings.push('There are pending migrations to apply');
return true; // Warning, not an error
} else {
print(` ✅ Migration status verified`, 'green');
results.passed.push('Database migrations');
return true;
}
} catch (error) {
print(` ⚠️ Could not verify migration status`, 'yellow');
results.warnings.push(`Migration check: ${error.message}`);
return true; // Not critical for the deploy
}
}
/**
* Checks critical dependencies
*/
function checkDependencies() {
print('\n🔍 Checking critical dependencies...', 'cyan');
const criticalDeps = [
'@prisma/client',
'express',
'bcrypt',
'jsonwebtoken',
'cors',
'helmet',
'dotenv',
];
const packageJsonPath = path.join(process.cwd(), 'package.json');
if (!fs.existsSync(packageJsonPath)) {
print(` ❌ package.json not found`, 'red');
results.failed.push('package.json not found');
return false;
}
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf-8'));
const allDeps = {
...packageJson.dependencies,
...packageJson.devDependencies
};
let allPresent = true;
for (const dep of criticalDeps) {
if (allDeps[dep]) {
print(`${dep}@${allDeps[dep]}`, 'green');
} else {
print(`${dep}: NOT INSTALLED`, 'red');
results.failed.push(`Missing critical dependency: ${dep}`);
allPresent = false;
}
}
if (allPresent) {
results.passed.push('Critical dependencies installed');
}
return allPresent;
}
/**
* Checks disk space
*/
function checkDiskSpace() {
print('\n🔍 Checking disk space...', 'cyan');
try {
// On Linux/macOS, use df
const platform = process.platform;
if (platform === 'linux' || platform === 'darwin') {
const output = runCommand('df -h .', { silent: true });
const lines = output.trim().split('\n');
const dataLine = lines[lines.length - 1];
const parts = dataLine.split(/\s+/);
const usedPercent = parseInt(parts[4].replace('%', ''));
if (usedPercent > 90) {
print(` ❌ Critical disk usage: ${usedPercent}%`, 'red');
results.failed.push(`Critical disk usage: ${usedPercent}%`);
return false;
} else if (usedPercent > 80) {
print(` ⚠️ High disk usage: ${usedPercent}%`, 'yellow');
results.warnings.push(`High disk usage: ${usedPercent}%`);
} else {
print(` ✅ Disk usage: ${usedPercent}%`, 'green');
}
results.passed.push('Disk space');
return true;
} else {
print(` ⚠️ Disk check not supported on ${platform}`, 'yellow');
results.warnings.push(`Disk check not supported on ${platform}`);
return true;
}
} catch (error) {
print(` ⚠️ Could not check disk space`, 'yellow');
results.warnings.push(`Disk check: ${error.message}`);
return true;
}
}
/**
* Verifies that the build works
*/
function checkBuild() {
print('\n🔍 Checking TypeScript build...', 'cyan');
try {
// Remove stale build output (note: tsc --noEmit below only type-checks and does not regenerate it)
if (fs.existsSync(path.join(process.cwd(), 'dist'))) {
print(` 🧹 Cleaning previous build...`, 'blue');
fs.rmSync(path.join(process.cwd(), 'dist'), { recursive: true });
}
// Try to compile
runCommand('npx tsc --noEmit', { silent: true });
print(` ✅ TypeScript compiles without errors`, 'green');
results.passed.push('TypeScript build');
return true;
} catch (error) {
print(` ❌ TypeScript compilation errors`, 'red');
print(` ${error.message}`, 'red');
results.failed.push('TypeScript compilation errors');
return false;
}
}
/**
* Checks configuration files
*/
function checkConfigurationFiles() {
print('\n🔍 Checking configuration files...', 'cyan');
const requiredFiles = [
'package.json',
'tsconfig.json',
'prisma/schema.prisma',
];
const optionalFiles = [
'.env.example',
'Dockerfile',
'docker-compose.yml',
];
let allRequiredPresent = true;
for (const file of requiredFiles) {
const filePath = path.join(process.cwd(), file);
if (fs.existsSync(filePath)) {
print(`${file}`, 'green');
} else {
print(`${file}: NOT FOUND`, 'red');
results.failed.push(`Missing required file: ${file}`);
allRequiredPresent = false;
}
}
for (const file of optionalFiles) {
const filePath = path.join(process.cwd(), file);
if (fs.existsSync(filePath)) {
print(`${file}`, 'green');
} else {
print(` ⚠️ ${file}: Not found (optional)`, 'yellow');
results.warnings.push(`Missing optional file: ${file}`);
}
}
if (allRequiredPresent) {
results.passed.push('Required configuration files');
}
return allRequiredPresent;
}
/**
* Runs tests (if any exist)
*/
function checkTests() {
print('\n🔍 Checking tests...', 'cyan');
// Look for test directories
const testDirs = ['tests', '__tests__', 'test', 'spec'];
const hasTests = testDirs.some(dir =>
fs.existsSync(path.join(process.cwd(), dir))
);
if (!hasTests) {
print(` ⚠️ No test directories found`, 'yellow');
results.warnings.push('No tests configured');
return true;
}
// Check whether a test script is configured
const packageJsonPath = path.join(process.cwd(), 'package.json');
const packageJson = JSON.parse(fs.readFileSync(packageJsonPath, 'utf-8'));
if (!packageJson.scripts?.test) {
print(` ⚠️ No test script configured`, 'yellow');
results.warnings.push('Test script not configured');
return true;
}
try {
// Try to run the tests
runCommand('npm test', { silent: true });
print(` ✅ Tests passed`, 'green');
results.passed.push('Tests passing');
return true;
} catch (error) {
print(` ❌ Some tests failed`, 'red');
results.failed.push('Failing tests');
return false;
}
}
/**
* Checks critical endpoints (requires a running server, and Node 18+ for global fetch)
*/
async function checkCriticalEndpoints() {
print('\n🔍 Checking critical endpoints...', 'cyan');
const baseUrl = process.env.API_URL || 'http://localhost:3000';
const endpoints = [
{ path: '/api/v1/health', name: 'Health Check' },
];
let allWorking = true;
for (const endpoint of endpoints) {
try {
const response = await fetch(`${baseUrl}${endpoint.path}`);
if (response.ok) {
print(`${endpoint.name} (${endpoint.path})`, 'green');
} else {
print(`${endpoint.name} (${endpoint.path}): HTTP ${response.status}`, 'red');
results.failed.push(`Endpoint unavailable: ${endpoint.path}`);
allWorking = false;
}
} catch (error) {
print(` ⚠️ ${endpoint.name} (${endpoint.path}): Server unavailable`, 'yellow');
results.warnings.push(`Could not check endpoint: ${endpoint.path}`);
// Not critical if the server isn't running during the check
}
}
if (allWorking) {
results.passed.push('Critical endpoints available');
}
return true;
}
/**
* Checks basic security configuration
*/
function checkSecurityConfig() {
print('\n🔍 Checking security configuration...', 'cyan');
const issues = [];
// Check JWT_SECRET
const jwtSecret = process.env.JWT_SECRET;
if (jwtSecret) {
if (jwtSecret.length < 32) {
issues.push('JWT_SECRET is too short (minimum 32 characters)');
}
if (jwtSecret === 'your-secret-key' || jwtSecret === 'secret') {
issues.push('JWT_SECRET uses an insecure default value');
}
}
// Check NODE_ENV
if (process.env.NODE_ENV === 'development') {
issues.push('NODE_ENV is set to development');
}
// Check CORS
if (process.env.FRONTEND_URL === '*') {
issues.push('CORS allows all origins (*)');
}
if (issues.length === 0) {
print(` ✅ Security configuration looks good`, 'green');
results.passed.push('Security configuration');
return true;
} else {
for (const issue of issues) {
print(` ⚠️ ${issue}`, 'yellow');
}
results.warnings.push('Security issues found');
return true; // Warnings, not errors
}
}
/**
* Prints the final summary
*/
function printSummary() {
print('\n' + '='.repeat(60), 'cyan');
print('PRE-DEPLOY CHECK SUMMARY', 'cyan');
print('='.repeat(60), 'cyan');
print(`\n✅ Checks passed: ${results.passed.length}`, 'green');
results.passed.forEach(item => print(`${item}`, 'green'));
if (results.warnings.length > 0) {
print(`\n⚠️ Warnings: ${results.warnings.length}`, 'yellow');
results.warnings.forEach(item => print(`${item}`, 'yellow'));
}
if (results.failed.length > 0) {
print(`\n❌ Errors: ${results.failed.length}`, 'red');
results.failed.forEach(item => print(`${item}`, 'red'));
}
print('\n' + '='.repeat(60), 'cyan');
if (results.failed.length === 0) {
print('✅ ALL CRITICAL CHECKS PASSED', 'green');
print('The system is ready to deploy.', 'green');
return 0;
} else {
print('❌ THERE ARE CRITICAL ERRORS THAT MUST BE FIXED', 'red');
print('Please fix the errors before deploying.', 'red');
return 1;
}
}
/**
* Entry point
*/
async function main() {
print('\n🚀 STARTING PRE-DEPLOY CHECKS', 'cyan');
print(`📅 ${new Date().toISOString()}`, 'blue');
print(`📁 Directory: ${process.cwd()}`, 'blue');
// Synchronous checks
checkEnvironmentVariables();
checkDependencies();
checkConfigurationFiles();
checkBuild();
checkSecurityConfig();
checkDiskSpace();
// Async checks
await checkDatabaseConnection();
await checkPendingMigrations();
await checkCriticalEndpoints();
// Tests (optional)
try {
checkTests();
} catch (e) {
// Ignore unexpected test-runner errors; failures are already recorded in results
}
// Print the summary and exit with the appropriate code
const exitCode = printSummary();
process.exit(exitCode);
}
// Run
main().catch(error => {
print(`\n💥 Fatal error: ${error.message}`, 'red');
process.exit(1);
});