PHASE 7 COMPLETE: Testing and Launch - PROJECT FINISHED

Implemented 4 modules with an agent swarm:

1. FUNCTIONAL TESTING (Jest)
   - Jest + ts-jest configuration
   - Unit tests: auth, booking, court (55 tests)
   - Integration tests: routes (56 tests)
   - Testing factories and utilities
   - Coverage configured (70% for services)
   - Scripts: test, test:watch, test:coverage

2. USER TESTING (Beta)
   - Beta tester system
   - Feedback with categories and severity
   - Beta issue tracking
   - 8 sample testers created
   - Complete API for feedback management

3. COMPLETE DOCUMENTATION
   - API.md - 150+ documented endpoints
   - SETUP.md - Installation guide
   - DEPLOY.md - VPS deployment
   - ARCHITECTURE.md - System architecture
   - APP_STORE.md - App store listing material
   - Complete Postman collection
   - PM2 ecosystem config
   - Nginx config with SSL

4. GO LIVE AND PRODUCTION
   - Monitoring system (logs, health checks)
   - Multi-channel alerting service
   - Pre-deploy check script
   - Docker + docker-compose for production
   - Automated backups
   - CI/CD with GitHub Actions
   - Complete launch checklist
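The health checks mentioned above lend themselves to a small curl-based probe. A minimal sketch, assuming the API exposes an HTTP health endpoint; the URL, retry count, and delay below are illustrative, not part of the project:

```shell
#!/bin/bash
# Minimal health-check probe with retries.
# The endpoint URL, retry count, and delay are illustrative assumptions;
# adjust them to the deployed API.
check_health() {
    local url="$1"
    local retries="${2:-3}"
    local delay="${3:-5}"
    local i
    for ((i = 1; i <= retries; i++)); do
        # -f: treat HTTP errors as failures, -sS: quiet but show errors,
        # --max-time: cap each attempt at 5 seconds
        if curl -fsS --max-time 5 "$url" > /dev/null 2>&1; then
            echo "healthy"
            return 0
        fi
        if [[ $i -lt $retries ]]; then
            sleep "$delay"
        fi
    done
    echo "unhealthy"
    return 1
}
```

Such a probe could feed the pre-deploy check or an external monitor; here it is only a standalone sketch.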

FINAL STATISTICS:
- Phases completed: 7/7
- Files created: 250+
- Lines of code: 60,000+
- API endpoints: 150+
- Tests: 110+
- Documentation: 5,000+ lines

PROJECT COMPLETE AND READY FOR PRODUCTION
2026-01-31 22:30:44 +00:00
parent e135e7ad24
commit dd10891432
61 changed files with 19256 additions and 142 deletions

backend/scripts/backup.sh Executable file

@@ -0,0 +1,363 @@
#!/bin/bash
# =============================================================================
# Backup script for App Padel
# Phase 7.4 - Go Live and Support
# =============================================================================
#
# This script backs up:
#   - The database (PostgreSQL or SQLite)
#   - Log files
#   - User-uploaded files (uploads)
#
# Backups are compressed and can optionally be uploaded to S3 (AWS, MinIO, etc.)
#
# Usage:
#   ./scripts/backup.sh
#
# Crontab (run daily at 2 AM):
#   0 2 * * * /path/to/scripts/backup.sh >> /var/log/padel-backup.log 2>&1
# =============================================================================
set -euo pipefail
# -----------------------------------------------------------------------------
# Configuration
# -----------------------------------------------------------------------------

# Directories
BACKUP_DIR="${BACKUP_DIR:-/backups}"
APP_DIR="${APP_DIR:-/app}"
LOGS_DIR="${APP_DIR}/logs"
UPLOADS_DIR="${APP_DIR}/uploads"

# Database
DB_TYPE="${DB_TYPE:-postgresql}"  # postgresql or sqlite
DB_HOST="${DB_HOST:-postgres}"
DB_PORT="${DB_PORT:-5432}"
DB_NAME="${DB_NAME:-padeldb}"
DB_USER="${DB_USER:-padeluser}"
DB_PASSWORD="${DB_PASSWORD:-}"
SQLITE_PATH="${SQLITE_PATH:-/app/prisma/dev.db}"

# Retention (days)
RETENTION_DAYS="${RETENTION_DAYS:-30}"

# Notifications
SLACK_WEBHOOK_URL="${SLACK_WEBHOOK_URL:-}"
EMAIL_TO="${BACKUP_EMAIL_TO:-}"
SMTP_HOST="${SMTP_HOST:-}"
SMTP_PORT="${SMTP_PORT:-587}"
SMTP_USER="${SMTP_USER:-}"
SMTP_PASS="${SMTP_PASS:-}"

# S3 (optional)
S3_BUCKET="${BACKUP_S3_BUCKET:-}"
S3_REGION="${BACKUP_S3_REGION:-us-east-1}"
S3_ENDPOINT="${BACKUP_S3_ENDPOINT:-}"  # For MinIO or other S3-compatible stores
AWS_ACCESS_KEY_ID="${AWS_ACCESS_KEY_ID:-}"
AWS_SECRET_ACCESS_KEY="${AWS_SECRET_ACCESS_KEY:-}"
# -----------------------------------------------------------------------------
# Internal variables
# -----------------------------------------------------------------------------
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
DATE=$(date +"%Y-%m-%d")
BACKUP_NAME="padel_backup_${TIMESTAMP}"
BACKUP_PATH="${BACKUP_DIR}/${BACKUP_NAME}"
LOG_FILE="${BACKUP_DIR}/backup_${TIMESTAMP}.log"

# Counters
ERRORS=0
WARNINGS=0
# -----------------------------------------------------------------------------
# Helper functions
# -----------------------------------------------------------------------------
log() {
    local level="$1"
    shift
    local message="$*"
    local timestamp
    timestamp=$(date '+%Y-%m-%d %H:%M:%S')
    echo "[${timestamp}] [${level}] ${message}" | tee -a "$LOG_FILE"
}

info()  { log "INFO" "$@"; }
# Note: a plain ((WARNINGS++)) returns a non-zero status when the counter is 0,
# which would abort the script under `set -e`; use assignments instead.
warn()  { log "WARN" "$@"; WARNINGS=$((WARNINGS + 1)); }
error() { log "ERROR" "$@"; ERRORS=$((ERRORS + 1)); }

send_notification() {
    local status="$1"
    local message="$2"

    # Slack
    if [[ -n "$SLACK_WEBHOOK_URL" ]]; then
        local color="good"
        [[ "$status" == "FAILED" ]] && color="danger"
        [[ "$status" == "WARNING" ]] && color="warning"
        curl -s -X POST "$SLACK_WEBHOOK_URL" \
            -H 'Content-type: application/json' \
            --data "{
                \"attachments\": [{
                    \"color\": \"${color}\",
                    \"title\": \"Padel Backup - ${status}\",
                    \"text\": \"${message}\",
                    \"footer\": \"Padel App\",
                    \"ts\": $(date +%s)
                }]
            }" || warn "Could not send Slack notification"
    fi

    # Email (via sendmail or similar)
    if [[ -n "$EMAIL_TO" && -n "$SMTP_HOST" ]]; then
        local subject="[Padel Backup] ${status} - ${DATE}"
        {
            echo "Subject: ${subject}"
            echo "To: ${EMAIL_TO}"
            echo "Content-Type: text/plain; charset=UTF-8"
            echo ""
            echo "$message"
            echo ""
            echo "---"
            echo "Timestamp: $(date)"
            echo "Hostname: $(hostname)"
            echo "Backup: ${BACKUP_NAME}"
        } | sendmail "$EMAIL_TO" || warn "Could not send email"
    fi
}

cleanup() {
    local exit_code=$?
    if [[ $exit_code -ne 0 ]]; then
        error "Script terminated with errors (code: $exit_code)"
        send_notification "FAILED" "Backup failed. See log: ${LOG_FILE}"
        # Remove temporary files
        if [[ -d "$BACKUP_PATH" ]]; then
            rm -rf "$BACKUP_PATH"
        fi
    fi
    exit $exit_code
}
trap cleanup EXIT
# -----------------------------------------------------------------------------
# Preparation
# -----------------------------------------------------------------------------
# Create the backup directories first: log() appends to a file under
# BACKUP_DIR, so the directory must exist before the first log call.
mkdir -p "$BACKUP_PATH"

info "Starting backup: ${BACKUP_NAME}"
info "Backup directory: ${BACKUP_DIR}"
# -----------------------------------------------------------------------------
# Database backup
# -----------------------------------------------------------------------------
info "Backing up database (${DB_TYPE})..."

if [[ "$DB_TYPE" == "postgresql" ]]; then
    if command -v pg_dump &> /dev/null; then
        export PGPASSWORD="$DB_PASSWORD"
        if pg_dump -h "$DB_HOST" -p "$DB_PORT" -U "$DB_USER" -d "$DB_NAME" \
            --verbose --no-owner --no-acl \
            -f "${BACKUP_PATH}/database.sql" 2>> "$LOG_FILE"; then
            info "PostgreSQL backup complete: database.sql"
        else
            error "PostgreSQL backup failed"
        fi
        unset PGPASSWORD
    else
        error "pg_dump not found"
    fi
elif [[ "$DB_TYPE" == "sqlite" ]]; then
    if [[ -f "$SQLITE_PATH" ]]; then
        # SQLite: use the online backup API to get a consistent copy
        if sqlite3 "$SQLITE_PATH" ".backup '${BACKUP_PATH}/database.db'" 2>> "$LOG_FILE"; then
            info "SQLite backup complete: database.db"
        else
            error "SQLite backup failed"
        fi
    else
        error "SQLite file not found: $SQLITE_PATH"
    fi
else
    error "Unsupported database type: $DB_TYPE"
fi
# -----------------------------------------------------------------------------
# Logs backup
# -----------------------------------------------------------------------------
info "Backing up logs..."

if [[ -d "$LOGS_DIR" ]]; then
    if tar -czf "${BACKUP_PATH}/logs.tar.gz" -C "$(dirname "$LOGS_DIR")" "$(basename "$LOGS_DIR")" 2>> "$LOG_FILE"; then
        info "Logs backup complete: logs.tar.gz"
    else
        warn "Could not compress logs (they may not exist)"
    fi
else
    warn "Logs directory not found: $LOGS_DIR"
fi

# -----------------------------------------------------------------------------
# Uploads backup
# -----------------------------------------------------------------------------
info "Backing up uploads..."

if [[ -d "$UPLOADS_DIR" ]]; then
    if tar -czf "${BACKUP_PATH}/uploads.tar.gz" -C "$(dirname "$UPLOADS_DIR")" "$(basename "$UPLOADS_DIR")" 2>> "$LOG_FILE"; then
        info "Uploads backup complete: uploads.tar.gz"
    else
        warn "Could not compress uploads"
    fi
else
    warn "Uploads directory not found: $UPLOADS_DIR"
fi
# -----------------------------------------------------------------------------
# Create manifest
# -----------------------------------------------------------------------------
cat > "${BACKUP_PATH}/manifest.json" << EOF
{
  "backup_name": "${BACKUP_NAME}",
  "timestamp": "$(date -Iseconds)",
  "hostname": "$(hostname)",
  "version": "1.0.0",
  "database": {
    "type": "${DB_TYPE}",
    "name": "${DB_NAME}"
  },
  "files": [
$(ls -1 "${BACKUP_PATH}" | grep -E '\.(sql|db|tar\.gz)$' | sed 's/^/    "/;s/$/"/' | paste -sd ',' -)
  ]
}
EOF
info "Manifest created"
# -----------------------------------------------------------------------------
# Compress the full backup
# -----------------------------------------------------------------------------
info "Compressing full backup..."

cd "$BACKUP_DIR"
if tar -czf "${BACKUP_NAME}.tar.gz" "$BACKUP_NAME"; then
    info "Backup compressed: ${BACKUP_NAME}.tar.gz"
    # Compute size
    BACKUP_SIZE=$(du -h "${BACKUP_NAME}.tar.gz" | cut -f1)
    info "Backup size: ${BACKUP_SIZE}"
    # Remove the temporary directory
    rm -rf "$BACKUP_PATH"
else
    error "Failed to compress backup"
    exit 1
fi
# -----------------------------------------------------------------------------
# Upload to S3 (optional)
# -----------------------------------------------------------------------------
if [[ -n "$S3_BUCKET" && -n "$AWS_ACCESS_KEY_ID" ]]; then
    info "Uploading backup to S3..."
    # Point the AWS CLI at a custom endpoint if one is configured
    if [[ -n "$S3_ENDPOINT" ]]; then
        export AWS_ENDPOINT_URL="$S3_ENDPOINT"
    fi
    if command -v aws &> /dev/null; then
        if aws s3 cp "${BACKUP_NAME}.tar.gz" "s3://${S3_BUCKET}/backups/" \
            --region "$S3_REGION" 2>> "$LOG_FILE"; then
            info "Backup uploaded to S3: s3://${S3_BUCKET}/backups/${BACKUP_NAME}.tar.gz"
        else
            error "Failed to upload backup to S3"
        fi
    else
        warn "AWS CLI not installed; skipping S3 upload"
    fi
fi
# -----------------------------------------------------------------------------
# Prune old backups
# -----------------------------------------------------------------------------
info "Pruning old backups (retention: ${RETENTION_DAYS} days)..."

# Local backups
find "$BACKUP_DIR" -name "padel_backup_*.tar.gz" -type f -mtime +"$RETENTION_DAYS" -delete 2>/dev/null || true
find "$BACKUP_DIR" -name "backup_*.log" -type f -mtime +"$RETENTION_DAYS" -delete 2>/dev/null || true
info "Local pruning complete"

# S3 backups (if configured)
if [[ -n "$S3_BUCKET" && -n "$AWS_ACCESS_KEY_ID" ]] && command -v aws &> /dev/null; then
    info "Pruning old backups in S3..."
    # List objects and delete those older than the retention window
    aws s3 ls "s3://${S3_BUCKET}/backups/" --region "$S3_REGION" | \
    while read -r line; do
        file_date=$(echo "$line" | awk '{print $1}')
        file_name=$(echo "$line" | awk '{print $4}')
        # Days elapsed since the object's date (requires GNU date)
        file_timestamp=$(date -d "$file_date" +%s 2>/dev/null || echo 0)
        current_timestamp=$(date +%s)
        days_old=$(( (current_timestamp - file_timestamp) / 86400 ))
        if [[ $days_old -gt $RETENTION_DAYS ]]; then
            aws s3 rm "s3://${S3_BUCKET}/backups/${file_name}" --region "$S3_REGION" 2>/dev/null || true
            info "Deleted old S3 backup: $file_name"
        fi
    done
fi
# -----------------------------------------------------------------------------
# Summary and notification
# -----------------------------------------------------------------------------
info "Backup complete: ${BACKUP_NAME}.tar.gz"
info "Size: ${BACKUP_SIZE}"
info "Errors: ${ERRORS}"
info "Warnings: ${WARNINGS}"

# Build the summary message
SUMMARY="Backup completed successfully.
Name: ${BACKUP_NAME}
Date: ${DATE}
Size: ${BACKUP_SIZE}
Errors: ${ERRORS}
Warnings: ${WARNINGS}
Included files:
- Database (${DB_TYPE})
- Logs
- Uploads
Location: ${BACKUP_DIR}/${BACKUP_NAME}.tar.gz"

# Disarm the EXIT trap so a non-zero final exit does not also fire the
# FAILED notification from cleanup()
trap - EXIT

if [[ $ERRORS -eq 0 ]]; then
    send_notification "SUCCESS" "$SUMMARY"
    info "✅ Backup finished successfully"
else
    send_notification "WARNING" "Backup completed with ${ERRORS} error(s). See log for details."
    warn "⚠️ Backup completed with errors"
fi

exit $ERRORS
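For completeness, a backup produced by this script can be restored by unpacking the archive and replaying the database dump. A hedged sketch; the target path and the database connection settings in the comments are assumptions, not part of the project:

```shell
#!/bin/bash
# Restore sketch for archives produced by backup.sh.
# Target paths and database connection details are illustrative assumptions.
set -euo pipefail

restore_backup() {
    local archive="$1"
    local target="${2:-./restore}"
    mkdir -p "$target"
    # Unpack the archive; --strip-components=1 drops the top-level
    # padel_backup_<timestamp>/ directory so the files land in $target.
    tar -xzf "$archive" -C "$target" --strip-components=1
    # PostgreSQL: replay the SQL dump (connection settings are assumptions):
    #   psql -h "$DB_HOST" -U "$DB_USER" -d "$DB_NAME" < "$target/database.sql"
    # SQLite: copy the snapshot back into place:
    #   cp "$target/database.db" /app/prisma/dev.db
    echo "restored to $target"
}
```

Usage would look like `restore_backup /backups/padel_backup_<timestamp>.tar.gz /tmp/restore`, after which the logs.tar.gz and uploads.tar.gz inside can be unpacked the same way.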