Practice 5 — Database Integration

Express + MySQL Integration for Review Aggregator

Practice · Database · MySQL · Express · Integration
Author
Affiliations

Université de Toulon

LIS UMR CNRS 7020

Publication date

2025-10-12

Abstract

Course lectures and practices for JavaScript full‑stack web development with AI‑assisted workflows.

Practice Overview

Duration: 60 minutes
Prerequisites: Session 4 Express API completed, MySQL knowledge
Objective: Replace in-memory arrays with MySQL database persistence

Core Learning Goals (60 minutes)

By the end of this practice, you will be able to:

  • Integrate MySQL with Express using connection pools
  • Replace in-memory arrays with database queries
  • Handle database errors gracefully in API endpoints
  • Implement transactions for data consistency
  • Optimize queries with proper indexing
  • Secure database operations against SQL injection

Prerequisite Check: You should be comfortable with the following (a self-check query follows this list):

  • Basic SQL (SELECT, INSERT, UPDATE, DELETE)
  • JOIN operations
  • Database design concepts
  • Express.js route handlers (from Session 4)
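
As a quick self-check, the query below touches most of these points (it assumes the products and reviews tables from the Session 4 schema):

-- Average rating per product across all sources
SELECT p.name,
       COUNT(r.id)             AS review_count,
       ROUND(AVG(r.rating), 1) AS average_rating
FROM products p
LEFT JOIN reviews r ON r.product_id = p.id
GROUP BY p.id, p.name
ORDER BY average_rating DESC;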


Part 1: Database Setup (10 min)

Exercise 1.1: Use Standardized Database Schema

Prerequisites: You should already have the database from Session 4, but if not:

# Run the standardized schema
mysql -u root -p < database-schema.sql

# Or connect and run manually
mysql -u root -p
# Then copy/paste content from database-schema.sql
Database Consistency

Important: Session 5 uses the same database schema as Session 4.

File reference: database-schema.sql (course root directory)

If you’re continuing from Session 4: Your database should already be set up correctly.

If you’re starting fresh: Run the standardized schema file first.

Verify your setup:

-- Connect to verify
mysql -u root -p

-- Check database and tables
USE review_aggregator;
SHOW TABLES;
-- Should show: products, reviews

-- Verify sample data exists  
SELECT COUNT(*) FROM products;
SELECT COUNT(*) FROM reviews;
  
For reference, database-schema.sql defines the reviews table along the following lines and seeds a few sample products (column types here are indicative; the file in the course root is authoritative):

-- Reviews table (excerpt from database-schema.sql)
CREATE TABLE IF NOT EXISTS reviews (
  id INT AUTO_INCREMENT PRIMARY KEY,
  product_id INT NOT NULL,
  source VARCHAR(50) NOT NULL,
  external_review_id VARCHAR(100) NOT NULL,
  author VARCHAR(255),
  rating TINYINT NOT NULL,
  title VARCHAR(255),
  content TEXT,
  review_date DATETIME,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  
  -- Constraints
  UNIQUE KEY unique_review (product_id, source, external_review_id),
  FOREIGN KEY (product_id) REFERENCES products(id) ON DELETE CASCADE,
  
  -- Performance indexes
  INDEX idx_product_id (product_id),
  INDEX idx_source (source),
  INDEX idx_rating (rating),
  INDEX idx_review_date (review_date)
);

-- Insert sample products
INSERT INTO products (name, description, price) VALUES
('Wireless Bluetooth Headphones', 'Premium noise-canceling headphones with 30-hour battery', 199.99),
('USB-C Fast Charger', '65W fast charger compatible with laptops and phones', 49.99),
('Mechanical Gaming Keyboard', 'RGB backlit mechanical keyboard with blue switches', 129.99);

Exercise 1.2: Connection Pool Setup

Create src/config/database.js:

const mysql = require('mysql2/promise');

// Create connection pool
const pool = mysql.createPool({
  host: process.env.DB_HOST || '127.0.0.1',
  port: process.env.DB_PORT || 3306,
  user: process.env.DB_USER || 'root',
  password: process.env.DB_PASSWORD || 'your_password_here',
  database: process.env.DB_NAME || 'review_aggregator',
  
  // Pool configuration
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,
  
  // Connection settings (the legacy `mysql` options acquireTimeout, timeout
  // and reconnect are not supported by mysql2 and would be ignored)
  connectTimeout: 10000,
  enableKeepAlive: true
});

// Test connection on startup
async function testConnection() {
  try {
    const [rows] = await pool.execute('SELECT 1 as test');
    console.log('✅ Database connected successfully');
    return true;
  } catch (error) {
    console.error('❌ Database connection failed:', error.message);
    return false;
  }
}

module.exports = { pool, testConnection };

/*
Instrumentation helper (optional but recommended):
Add a small wrapper to expose safe pool statistics. mysql2's pool does not
expose `totalConnections`/`freeConnections` consistently across versions,
so we track open/acquired connections explicitly if you need runtime stats.

Example (append to src/config/database.js):

// ... existing pool creation above ...
let activeConnections = 0;

const originalGetConnection = pool.getConnection.bind(pool);
pool.getConnection = async function(...args) {
  const conn = await originalGetConnection(...args);
  activeConnections += 1;

  const originalRelease = conn.release.bind(conn);
  conn.release = function() {
    activeConnections = Math.max(0, activeConnections - 1);
    return originalRelease();
  };

  return conn;
};

function getPoolStats() {
  return {
    connectionLimit: pool.config.connectionLimit,
    activeConnections,
    // Note: pool._allConnections and other internals are not portable
    // across mysql2 versions; prefer to track what you need explicitly.
  };
}

module.exports = { pool, testConnection, getPoolStats };

*/
// Note: if your migration files contain several SQL statements, add the
// `multipleStatements: true` option to the options object passed to
// mysql.createPool() above. It lets a single query() call run multiple
// statements.
//
// Example (add inside the createPool options above):
//
//   multipleStatements: true,
//
// Be cautious with this option: combined with unsanitized input it widens the
// impact of SQL injection. Keep using parameterized queries for user data.

Test: Run npm run dev and verify “✅ Database connected successfully” appears.
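
Note that testConnection() is only exported above; for the startup message to appear, call it when the server boots. A minimal sketch, assuming the src/server.js entry point, the src/app.js Express app, and the dotenv-based .env loading from Session 4 (adjust paths and names to your project):

// src/server.js (excerpt) -- assumes dotenv and an app module from Session 4
require('dotenv').config();               // load DB_* variables from .env

const app = require('./app');             // Express app from Session 4
const { testConnection } = require('./config/database');

const PORT = process.env.PORT || 4000;

async function start() {
  // Fail fast if the database is unreachable
  const ok = await testConnection();
  if (!ok) {
    process.exit(1);
  }

  app.listen(PORT, () => {
    console.log(`API listening on http://localhost:${PORT}`);
  });
}

start();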


Part 2: Replace GET Endpoints (15 min)

Exercise 2.1: Replace GET /api/reviews

Update src/routes/reviews.js:

const express = require('express');
const { pool } = require('../config/database');
const router = express.Router();

// GET /api/reviews - Get reviews with filtering
router.get('/', async (req, res) => {
  try {
    const {
      product_id,
      source,
      min_rating,
      limit = 20,
      offset = 0
    } = req.query;
    
    // product_id is required: an undefined bind parameter makes mysql2 throw
    if (!product_id) {
      return res.status(400).json({
        success: false,
        error: 'product_id query parameter is required'
      });
    }
    
    // Build dynamic WHERE clause
    let sql = 'SELECT * FROM reviews WHERE product_id = ?';
    let params = [product_id];
    
    if (source) {
      sql += ' AND source = ?';
      params.push(source);
    }
    
    if (min_rating) {
      sql += ' AND rating >= ?';
      params.push(min_rating);
    }
    
    // Coerce pagination values to safe integers (NaN would break the query)
    const safeLimit = Number.parseInt(limit, 10) || 20;
    const safeOffset = Number.parseInt(offset, 10) || 0;
    
    sql += ' ORDER BY review_date DESC LIMIT ? OFFSET ?';
    params.push(safeLimit, safeOffset);
    
    const [rows] = await pool.execute(sql, params);
    
    // Get total count for pagination
    let countSql = 'SELECT COUNT(*) as total FROM reviews WHERE product_id = ?';
    let countParams = [product_id];
    
    if (source) {
      countSql += ' AND source = ?';
      countParams.push(source);
    }
    
    if (min_rating) {
      countSql += ' AND rating >= ?';
      countParams.push(min_rating);
    }
    
    const [countRows] = await pool.execute(countSql, countParams);
    
    res.json({
      success: true,
      data: rows,
      meta: {
        total: countRows[0].total,
        limit: safeLimit,
        offset: safeOffset
      }
    });
  } catch (error) {
    console.error('Database error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to fetch reviews'
    });
  }
});

module.exports = router;
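
A quick manual check once the server is running (assuming port 4000 as in the curl examples later in this practice, and the router mounted at /api/reviews):

# All reviews for product 1, newest first
curl "http://localhost:4000/api/reviews?product_id=1"

# Only Amazon reviews rated 4 or higher, first page of 10
curl "http://localhost:4000/api/reviews?product_id=1&source=Amazon&min_rating=4&limit=10&offset=0"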

Exercise 2.2: Add Aggregate Statistics Endpoint

// GET /api/reviews/aggregate - Get aggregated statistics
router.get('/aggregate', async (req, res) => {
  try {
    const { product_id } = req.query;
    
    if (!product_id) {
      return res.status(400).json({
        success: false,
        error: 'product_id query parameter is required'
      });
    }
    
    // Overall statistics
    const [overall] = await pool.execute(
      `SELECT 
        COALESCE(AVG(rating), 0) as average_rating,
        COUNT(*) as total_reviews,
        MIN(rating) as min_rating,
        MAX(rating) as max_rating
       FROM reviews 
       WHERE product_id = ?`,
      [product_id]
    );
    
    // Per-source breakdown
    const [sources] = await pool.execute(
      `SELECT 
        source,
        AVG(rating) as average_rating,
        COUNT(*) as review_count
       FROM reviews 
       WHERE product_id = ?
       GROUP BY source
       ORDER BY review_count DESC`,
      [product_id]
    );
    
    // Rating distribution histogram
    const [distribution] = await pool.execute(
      `SELECT 
        rating,
        COUNT(*) as count
       FROM reviews 
       WHERE product_id = ?
       GROUP BY rating
       ORDER BY rating DESC`,
      [product_id]
    );
    
    // Transform distribution to object
    const ratingHistogram = { "5": 0, "4": 0, "3": 0, "2": 0, "1": 0 };
    distribution.forEach(row => {
      ratingHistogram[row.rating.toString()] = row.count;
    });
    
    res.json({
      success: true,
      data: {
        overall: {
          average_rating: overall[0].average_rating 
            ? Math.round(overall[0].average_rating * 10) / 10 
            : 0,
          total_reviews: overall[0].total_reviews,
          min_rating: overall[0].min_rating || 0,
          max_rating: overall[0].max_rating || 0
        },
        by_source: sources.map(row => ({
          source: row.source,
          average_rating: Math.round(row.average_rating * 10) / 10,
          review_count: row.review_count
        })),
        rating_histogram: ratingHistogram
      }
    });
    
  } catch (error) {
    console.error('Aggregate query error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to calculate statistics'
    });
  }
});
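
Once some reviews exist (for example after the inserts in Part 4), the response has roughly this shape; the numbers below are illustrative, not expected output:

{
  "success": true,
  "data": {
    "overall": { "average_rating": 4.7, "total_reviews": 3, "min_rating": 4, "max_rating": 5 },
    "by_source": [
      { "source": "Amazon", "average_rating": 5, "review_count": 1 },
      { "source": "BestBuy", "average_rating": 4, "review_count": 1 },
      { "source": "Walmart", "average_rating": 5, "review_count": 1 }
    ],
    "rating_histogram": { "5": 2, "4": 1, "3": 0, "2": 0, "1": 0 }
  }
}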

Part 3: Implement POST and Transaction Endpoints (20 min)

Exercise 3.1: Add Review Creation Endpoint

// POST /api/reviews - Create new review
router.post('/', async (req, res) => {
  try {
    const {
      product_id,
      source,
      external_review_id,
      author,
      rating,
      title,
      content,
      review_date
    } = req.body;
    
    const [result] = await pool.execute(
      `INSERT INTO reviews 
       (product_id, source, external_review_id, author, rating, title, content, review_date)
       VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
      [
        product_id,
        source,
        external_review_id,
        author || null,
        rating,
        title || null,
        content || null,
        review_date || new Date()
      ]
    );
    
    // Fetch the created review
    const [rows] = await pool.execute(
      'SELECT * FROM reviews WHERE id = ?',
      [result.insertId]
    );
    
    res.status(201).json({
      success: true,
      data: rows[0]
    });
    
  } catch (error) {
    // Handle specific MySQL errors
    if (error.code === 'ER_DUP_ENTRY') {
      return res.status(409).json({
        success: false,
        error: 'Review already exists',
        code: 'DUPLICATE_REVIEW'
      });
    }
    
    if (error.code === 'ER_NO_REFERENCED_ROW_2') {
      return res.status(400).json({
        success: false,
        error: 'Product not found',
        code: 'INVALID_PRODUCT'
      });
    }
    
    console.error('Database error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to create review'
    });
  }
});
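
The endpoint above relies on the MySQL constraints to reject bad data. If you prefer to fail before touching the database, a minimal hand-rolled check can sit at the top of the handler (a sketch; adapt it if you already added validation middleware in Session 4):

// Minimal request validation (place before the INSERT)
const requiredFields = ['product_id', 'source', 'external_review_id', 'rating'];
const missing = requiredFields.filter(field => req.body[field] === undefined);

if (missing.length > 0) {
  return res.status(400).json({
    success: false,
    error: `Missing required fields: ${missing.join(', ')}`
  });
}

if (req.body.rating < 1 || req.body.rating > 5) {
  return res.status(400).json({
    success: false,
    error: 'rating must be between 1 and 5'
  });
}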

Exercise 3.2: Implement Batch Review Insert with Transaction

// POST /api/reviews/batch - Create multiple reviews in transaction
router.post('/batch', async (req, res) => {
  // Validate the payload before borrowing a connection from the pool
  const { reviews } = req.body;
  if (!Array.isArray(reviews) || reviews.length === 0) {
    return res.status(400).json({
      success: false,
      error: 'reviews must be a non-empty array'
    });
  }
  
  const connection = await pool.getConnection();
  
  try {
    await connection.beginTransaction();
    
    const insertedIds = [];
    
    for (const review of req.body.reviews) {
      const [result] = await connection.execute(
        `INSERT INTO reviews 
         (product_id, source, external_review_id, author, rating, title, content, review_date)
         VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
        [
          review.product_id,
          review.source,
          review.external_review_id,
          review.author || null,
          review.rating,
          review.title || null,
          review.content || null,
          review.review_date || new Date()
        ]
      );
      insertedIds.push(result.insertId);
    }
    
    await connection.commit();
    
    // Fetch all inserted reviews
    if (insertedIds.length > 0) {
      const placeholders = insertedIds.map(() => '?').join(',');
      const [rows] = await pool.execute(
        `SELECT * FROM reviews WHERE id IN (${placeholders})`,
        insertedIds
      );
      
      res.status(201).json({
        success: true,
        message: `Created ${insertedIds.length} reviews`,
        data: rows
      });
    }
    
  } catch (error) {
    await connection.rollback();
    
    if (error.code === 'ER_DUP_ENTRY') {
      return res.status(409).json({
        success: false,
        error: 'Duplicate review in batch',
        code: 'BATCH_DUPLICATE'
      });
    }
    
    console.error('Batch insert error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to create reviews batch'
    });
  } finally {
    connection.release();
  }
});
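
Inserting row by row keeps per-row error handling simple, but for large batches you can also build a single multi-row INSERT inside the same transaction. A sketch of the alternative (it reuses the reviews array validated at the top of the handler):

// One multi-row INSERT: a single round trip instead of N
const values = reviews.map(r => [
  r.product_id,
  r.source,
  r.external_review_id,
  r.author || null,
  r.rating,
  r.title || null,
  r.content || null,
  r.review_date || new Date()
]);

// connection.query() (not execute) supports the nested-array bulk syntax VALUES ?
const [result] = await connection.query(
  `INSERT INTO reviews
   (product_id, source, external_review_id, author, rating, title, content, review_date)
   VALUES ?`,
  [values]
);
// Note: result.insertId reports only the first generated id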

Part 4: Testing & Health Checks (10 min)

Exercise 4.1: Test Complete CRUD Operations

Test creating reviews:

# Create a review
curl -X POST http://localhost:4000/api/reviews \
  -H "Content-Type: application/json" \
  -d '{
    "product_id": 1,
    "source": "Amazon",
    "external_review_id": "AMZ001",
    "author": "John Doe",
    "rating": 5,
    "title": "Excellent headphones!",
    "content": "Great sound quality and battery life.",
    "review_date": "2025-09-15T10:30:00Z"
  }'

# Test batch insert
curl -X POST http://localhost:4000/api/reviews/batch \
  -H "Content-Type: application/json" \
  -d '{
    "reviews": [
      {
        "product_id": 1,
        "source": "BestBuy",
        "external_review_id": "BB001",
        "author": "Jane Smith",
        "rating": 4,
        "title": "Good value",
        "content": "Nice quality for the price."
      },
      {
        "product_id": 1,
        "source": "Walmart",
        "external_review_id": "WM001",
        "author": "Mike Wilson",
        "rating": 5,
        "title": "Love them!",
        "content": "Best headphones I have owned."
      }
    ]
  }'

Exercise 4.2: Add Database Health Check

Create src/routes/health.js:

const express = require('express');
const { pool } = require('../config/database');
const router = express.Router();

router.get('/database', async (req, res) => {
  try {
    const start = Date.now();
    const [rows] = await pool.execute('SELECT COUNT(*) as review_count FROM reviews');
    const queryTime = Date.now() - start;
    
    res.json({
      status: 'healthy',
      database: 'connected',
      response_time_ms: queryTime,
      total_reviews: rows[0].review_count,
      pool_status: {
        // mysql2 does not expose portable connection counters (see the note in
        // Part 1); report the configured limit, and use the optional
        // getPoolStats() helper if you need live counts
        connection_limit: pool.pool.config.connectionLimit
      },
      timestamp: new Date().toISOString()
    });
    
  } catch (error) {
    res.status(503).json({
      status: 'unhealthy',
      database: 'disconnected',
      error: error.message,
      timestamp: new Date().toISOString()
    });
  }
});

module.exports = router;
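
For the /health/database URL used below to resolve, remember to mount the router (a one-line sketch, assuming an Express app object in src/app.js as in Session 4):

// src/app.js (excerpt)
app.use('/health', require('./routes/health'));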

Exercise 4.3: Final Verification

Test all endpoints work:

# Health check
curl http://localhost:4000/health/database

# Get reviews
curl "http://localhost:4000/api/reviews?product_id=1"

# Get aggregate stats
curl "http://localhost:4000/api/reviews/aggregate?product_id=1"

Part 5: Optional Extensions (5 min)

Extension 1: Query Performance Monitoring

// Add query timing to all database operations (e.g. appended to src/config/database.js)
const originalExecute = pool.execute;
pool.execute = async function(sql, params) {
  const start = Date.now();
  try {
    const result = await originalExecute.call(this, sql, params);
    const duration = Date.now() - start;
    if (duration > 1000) {
      console.warn(`🐌 Slow query (${duration}ms):`, sql.substring(0, 50) + '...');
    }
    return result;
  } catch (error) {
    console.error(`❌ Query failed (${Date.now() - start}ms):`, error.message);
    throw error;
  }
};
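
Note that this wrapper only instruments pool.execute(); calls made through pool.query() (for example the migration runner in Extension 2) or through connections obtained via pool.getConnection() are not timed. The same wrapping pattern applies to those methods if you need it.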

Extension 2: Migration System

// scripts/migrate.js
const fs = require('fs');
const path = require('path');
const { pool } = require('../src/config/database');

async function runMigrations() {
  // Create migrations table
  await pool.execute(`
    CREATE TABLE IF NOT EXISTS migrations (
      id INT AUTO_INCREMENT PRIMARY KEY,
      filename VARCHAR(255) NOT NULL UNIQUE,
      executed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
  `);
  
  // Run pending migrations
  const migrationsDir = path.join(__dirname, '../migrations');
  const files = fs.readdirSync(migrationsDir).filter(f => f.endsWith('.sql')).sort();
  
  for (const file of files) {
    const [existing] = await pool.execute('SELECT filename FROM migrations WHERE filename = ?', [file]);
    if (existing.length > 0) continue;
    
    console.log(`Running migration: ${file}`);
    const sql = fs.readFileSync(path.join(migrationsDir, file), 'utf8');
    // Use query() rather than execute(): prepared statements cannot run files
    // containing several statements (also requires multipleStatements: true
    // in the pool configuration, see Part 1)
    await pool.query(sql);
    
    await pool.execute('INSERT INTO migrations (filename) VALUES (?)', [file]);
    console.log(`✅ Completed: ${file}`);
  }
  
  console.log('🎉 All migrations completed!');
  process.exit(0);
}

runMigrations().catch(error => {
  console.error('❌ Migration failed:', error);
  process.exit(1);
});
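
Run the script directly, or wire it into package.json (the script name below is a suggestion, not part of the course scaffold):

node scripts/migrate.js

# or, with "migrate": "node scripts/migrate.js" in package.json
npm run migrate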

Success Criteria & Assessment

Core Requirements ✅

Testing Checklist ✅


Troubleshooting

Common Issues

Problem: ER_ACCESS_DENIED_ERROR

# Check credentials in .env
DB_USER=root
DB_PASSWORD=password
DB_HOST=127.0.0.1

Problem: ER_NO_SUCH_TABLE

# Run schema migration
mysql -h 127.0.0.1 -P 3306 -u root -ppassword review_aggregator < database-schema.sql

Problem: ECONNREFUSED

# Check MySQL container
docker ps | grep mysql
# Restart if needed
docker-compose up -d mysql
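
If the container is up but connections are still refused, check its logs and the published port (the service name mysql is an assumption; use the name from your compose file):

# Inspect MySQL startup logs for errors
docker-compose logs mysql

# Confirm port 3306 is published to the host
docker-compose port mysql 3306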


🎉 Practice Complete!

Congratulations! You’ve successfully:

  • ✅ Integrated MySQL with Express using connection pools
  • ✅ Replaced in-memory arrays with persistent database storage
  • ✅ Implemented parameterized queries for SQL injection prevention
  • ✅ Added transaction support for data consistency
  • ✅ Created aggregate statistics calculations
  • ✅ Built proper error handling for database operations

You’re ready for full-stack integration! 🚀
