Practice 5 — Database Integration
Express + MySQL Integration for Review Aggregator
Course lectures and practices for JavaScript full‑stack web development with AI‑assisted workflows.
Practice Overview
Duration: 60 minutes
Prerequisites: Session 4 Express API completed, MySQL knowledge
Objective: Replace in-memory arrays with MySQL database persistence
By the end of this practice, you will be able to:
- ✅ Integrate MySQL with Express using connection pools
- ✅ Replace in-memory arrays with database queries
- ✅ Handle database errors gracefully in API endpoints
- ✅ Implement transactions for data consistency
- ✅ Optimize queries with proper indexing
- ✅ Secure database operations against SQL injection
Prerequisite Check: You should be comfortable with:
- Basic SQL (SELECT, INSERT, UPDATE, DELETE)
- JOIN operations
- Database design concepts
- Express.js route handlers (from Session 4)
Part 1: Database Setup (10 min)
Exercise 1.1: Use Standardized Database Schema
Prerequisites: You should already have the database from Session 4, but if not:
# Run the standardized schema
mysql -u root -p < database-schema.sql
# Or connect and run manually
mysql -u root -p
# Then copy/paste content from database-schema.sql
Important: Session 5 uses the same database schema as Session 4.
File reference: database-schema.sql
(course root directory)
If you’re continuing from Session 4: Your database should already be set up correctly.
If you’re starting fresh: Run the standardized schema file first.
Verify your setup:
-- Connect to verify
mysql -u root -p

-- Check database and tables
USE review_aggregator;
SHOW TABLES; -- Should show: products, reviews

-- Verify sample data exists
SELECT COUNT(*) FROM products;
SELECT COUNT(*) FROM reviews;
-- Key parts of the reviews table (full definition in database-schema.sql):

-- Constraints
UNIQUE KEY unique_review (product_id, source, external_review_id),
FOREIGN KEY (product_id) REFERENCES products(id) ON DELETE CASCADE,

-- Performance indexes
INDEX idx_product_id (product_id),
INDEX idx_source (source),
INDEX idx_rating (rating),
INDEX idx_review_date (review_date)
);
-- Insert sample products
INSERT INTO products (name, description, price) VALUES
('Wireless Bluetooth Headphones', 'Premium noise-canceling headphones with 30-hour battery', 199.99),
('USB-C Fast Charger', '65W fast charger compatible with laptops and phones', 49.99),
('Mechanical Gaming Keyboard', 'RGB backlit mechanical keyboard with blue switches', 129.99);
Exercise 1.2: Connection Pool Setup
Create src/config/database.js:
const mysql = require('mysql2/promise');

// Create connection pool
const pool = mysql.createPool({
  host: process.env.DB_HOST || '127.0.0.1',
  port: process.env.DB_PORT || 3306,
  user: process.env.DB_USER || 'root',
  password: process.env.DB_PASSWORD || 'your_password_here',
  database: process.env.DB_NAME || 'review_aggregator',

  // Pool configuration
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,

  // Connection settings. Note: legacy options such as `acquireTimeout`,
  // `timeout`, and `reconnect` are not valid mysql2 options and trigger
  // warnings; `connectTimeout` is the supported equivalent.
  connectTimeout: 60000
});

// Test connection on startup
async function testConnection() {
  try {
    const [rows] = await pool.execute('SELECT 1 as test');
    console.log('✅ Database connected successfully');
    return true;
  } catch (error) {
    console.error('❌ Database connection failed:', error.message);
    return false;
  }
}

module.exports = { pool, testConnection };
/*
Instrumentation helper (optional but recommended):
Add a small wrapper to expose safe pool statistics. mysql2's pool does not
expose `totalConnections`/`freeConnections` consistently across versions,
so we track open/acquired connections explicitly if you need runtime stats.
Example (append to src/config/database.js):
// ... existing pool creation above ...
let activeConnections = 0;
const originalGetConnection = pool.getConnection.bind(pool);
pool.getConnection = async function(...args) {
const conn = await originalGetConnection(...args);
activeConnections += 1;
const originalRelease = conn.release.bind(conn);
conn.release = function() {
activeConnections = Math.max(0, activeConnections - 1);
return originalRelease();
};
return conn;
};
function getPoolStats() {
return {
connectionLimit: pool.config.connectionLimit,
activeConnections,
// Note: pool._allConnections and other internals are not portable
// across mysql2 versions; prefer to track what you need explicitly.
};
}
module.exports = { pool, testConnection, getPoolStats };
*/
// Note: If you run multiple statements in your migrations, the
// `multipleStatements` option must be set to true in your pool
// configuration, and you must run them with `pool.query()` rather than
// `pool.execute()` (prepared statements only accept a single statement).
// Example:
//
// const pool = mysql.createPool({
//   // ... other configurations ...
//   multipleStatements: true
// });
//
// Be cautious with this option, as it can expose your application to SQL
// injection if inputs are not validated and sanitized.
Test: Run npm run dev and verify “✅ Database connected successfully” appears.
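A common pattern is to fail fast at startup when `testConnection()` reports an unreachable database. The sketch below shows the shape such wiring could take; `startIfHealthy` is a hypothetical name, and both dependencies are injected so the flow can be exercised without a real MySQL instance:

```javascript
// Hypothetical startup wiring (sketch): abort startup when the database
// health check fails. `testConnection` and `startServer` are injected so
// this logic can be tested without a real database.
async function startIfHealthy(testConnection, startServer) {
  const ok = await testConnection();
  if (!ok) {
    // In a real src/server.js you would log and call process.exit(1) here
    return 'aborted';
  }
  startServer();
  return 'started';
}
```

In your own entry point this might reduce to `if (!(await testConnection())) process.exit(1); app.listen(4000);` — adapt to however your server is started.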
Part 2: Replace GET Endpoints (15 min)
Exercise 2.1: Replace GET /api/reviews
Update src/routes/reviews.js:
const express = require('express');
const { pool } = require('../config/database');
const router = express.Router();

// GET /api/reviews - Get reviews with filtering
router.get('/', async (req, res) => {
  try {
    const {
      product_id,
      source,
      min_rating,
      limit = 20,
      offset = 0
    } = req.query;

    // Build dynamic WHERE clause
    let sql = 'SELECT * FROM reviews WHERE product_id = ?';
    let params = [product_id];

    if (source) {
      sql += ' AND source = ?';
      params.push(source);
    }

    if (min_rating) {
      sql += ' AND rating >= ?';
      params.push(min_rating);
    }

    sql += ' ORDER BY review_date DESC LIMIT ? OFFSET ?';
    params.push(parseInt(limit), parseInt(offset));

    const [rows] = await pool.execute(sql, params);

    // Get total count for pagination
    let countSql = 'SELECT COUNT(*) as total FROM reviews WHERE product_id = ?';
    let countParams = [product_id];

    if (source) {
      countSql += ' AND source = ?';
      countParams.push(source);
    }

    if (min_rating) {
      countSql += ' AND rating >= ?';
      countParams.push(min_rating);
    }

    const [countRows] = await pool.execute(countSql, countParams);

    res.json({
      success: true,
      data: rows,
      meta: {
        total: countRows[0].total,
        limit: parseInt(limit),
        offset: parseInt(offset)
      }
    });
  } catch (error) {
    console.error('Database error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to fetch reviews'
    });
  }
});

module.exports = router;
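The filter-building code above is duplicated between the data query and the count query. If you prefer to keep the two in sync, the shared fragment can be extracted into a pure helper — a sketch; `buildReviewFilter` is a hypothetical name, not part of the course files:

```javascript
// Hypothetical helper: build the shared WHERE fragment and parameter list
// once, so the data query and the COUNT query cannot drift apart.
function buildReviewFilter({ product_id, source, min_rating }) {
  let where = ' WHERE product_id = ?';
  const params = [product_id];

  if (source) {
    where += ' AND source = ?';
    params.push(source);
  }

  if (min_rating) {
    where += ' AND rating >= ?';
    params.push(min_rating);
  }

  return { where, params };
}

// Usage sketch:
// const { where, params } = buildReviewFilter(req.query);
// const [rows] = await pool.execute(
//   'SELECT * FROM reviews' + where + ' ORDER BY review_date DESC LIMIT ? OFFSET ?',
//   [...params, parseInt(limit), parseInt(offset)]
// );
// const [countRows] = await pool.execute(
//   'SELECT COUNT(*) as total FROM reviews' + where, params
// );
```

Because the helper is pure, it is trivial to unit-test without a database connection.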
Exercise 2.2: Add Aggregate Statistics Endpoint
// GET /api/reviews/aggregate - Get aggregated statistics
router.get('/aggregate', async (req, res) => {
  try {
    const { product_id } = req.query;

    // Overall statistics
    const [overall] = await pool.execute(
      `SELECT
        COALESCE(AVG(rating), 0) as average_rating,
        COUNT(*) as total_reviews,
        MIN(rating) as min_rating,
        MAX(rating) as max_rating
      FROM reviews
      WHERE product_id = ?`,
      [product_id]
    );

    // Per-source breakdown
    const [sources] = await pool.execute(
      `SELECT
        source,
        AVG(rating) as average_rating,
        COUNT(*) as review_count
      FROM reviews
      WHERE product_id = ?
      GROUP BY source
      ORDER BY review_count DESC`,
      [product_id]
    );

    // Rating distribution histogram
    const [distribution] = await pool.execute(
      `SELECT
        rating,
        COUNT(*) as count
      FROM reviews
      WHERE product_id = ?
      GROUP BY rating
      ORDER BY rating DESC`,
      [product_id]
    );

    // Transform distribution to object
    const ratingHistogram = { "5": 0, "4": 0, "3": 0, "2": 0, "1": 0 };
    distribution.forEach(row => {
      ratingHistogram[row.rating.toString()] = row.count;
    });

    res.json({
      success: true,
      data: {
        overall: {
          average_rating: overall[0].average_rating
            ? Math.round(overall[0].average_rating * 10) / 10
            : 0,
          total_reviews: overall[0].total_reviews,
          min_rating: overall[0].min_rating || 0,
          max_rating: overall[0].max_rating || 0
        },
        by_source: sources.map(row => ({
          source: row.source,
          average_rating: Math.round(row.average_rating * 10) / 10,
          review_count: row.review_count
        })),
        rating_histogram: ratingHistogram
      }
    });
  } catch (error) {
    console.error('Aggregate query error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to calculate statistics'
    });
  }
});
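The histogram transform above is a good candidate for a small pure helper that can be unit-tested without a database. A sketch (`toRatingHistogram` is a hypothetical name); it also ignores unexpected rating values instead of adding stray keys:

```javascript
// Hypothetical helper: turn GROUP BY rating rows into a fixed 5..1
// histogram, filling in zeros for ratings that have no reviews and
// dropping any rating outside the 1-5 range.
function toRatingHistogram(rows) {
  const histogram = { "5": 0, "4": 0, "3": 0, "2": 0, "1": 0 };
  for (const row of rows) {
    const key = String(row.rating);
    if (key in histogram) {
      histogram[key] = row.count;
    }
  }
  return histogram;
}
```

In the route this would replace the `forEach` loop: `const ratingHistogram = toRatingHistogram(distribution);`.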
Part 3: Implement POST and Transaction Endpoints (20 min)
Exercise 3.1: Add Review Creation Endpoint
// POST /api/reviews - Create new review
router.post('/', async (req, res) => {
  try {
    const {
      product_id,
      source,
      external_review_id,
      author,
      rating,
      title,
      content,
      review_date
    } = req.body;

    const [result] = await pool.execute(
      `INSERT INTO reviews
        (product_id, source, external_review_id, author, rating, title, content, review_date)
      VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
      [
        product_id,
        source,
        external_review_id,
        author || null,
        rating,
        title || null,
        content || null,
        review_date || new Date()
      ]
    );

    // Fetch the created review
    const [rows] = await pool.execute(
      'SELECT * FROM reviews WHERE id = ?',
      [result.insertId]
    );

    res.status(201).json({
      success: true,
      data: rows[0]
    });
  } catch (error) {
    // Handle specific MySQL errors
    if (error.code === 'ER_DUP_ENTRY') {
      return res.status(409).json({
        success: false,
        error: 'Review already exists',
        code: 'DUPLICATE_REVIEW'
      });
    }

    if (error.code === 'ER_NO_REFERENCED_ROW_2') {
      return res.status(400).json({
        success: false,
        error: 'Product not found',
        code: 'INVALID_PRODUCT'
      });
    }

    console.error('Database error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to create review'
    });
  }
});
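The error branches above tend to repeat across routes. One way to centralize the code-to-status translation is a small mapper — a sketch; `mapMysqlError` is a hypothetical helper, not part of the course files:

```javascript
// Hypothetical helper: map MySQL error codes to HTTP responses so every
// route handles constraint violations consistently.
function mapMysqlError(error) {
  switch (error.code) {
    case 'ER_DUP_ENTRY':
      return {
        status: 409,
        body: { success: false, error: 'Review already exists', code: 'DUPLICATE_REVIEW' }
      };
    case 'ER_NO_REFERENCED_ROW_2':
      return {
        status: 400,
        body: { success: false, error: 'Product not found', code: 'INVALID_PRODUCT' }
      };
    default:
      // Unknown database error: do not leak internals to the client
      return {
        status: 500,
        body: { success: false, error: 'Database operation failed' }
      };
  }
}
```

A route's catch block then collapses to `const { status, body } = mapMysqlError(error); res.status(status).json(body);`, with logging kept for the 500 case.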
Exercise 3.2: Implement Batch Review Insert with Transaction
// POST /api/reviews/batch - Create multiple reviews in transaction
router.post('/batch', async (req, res) => {
  const connection = await pool.getConnection();
  try {
    await connection.beginTransaction();

    const insertedIds = [];
    for (const review of req.body.reviews) {
      const [result] = await connection.execute(
        `INSERT INTO reviews
          (product_id, source, external_review_id, author, rating, title, content, review_date)
        VALUES (?, ?, ?, ?, ?, ?, ?, ?)`,
        [
          review.product_id,
          review.source,
          review.external_review_id,
          review.author || null,
          review.rating,
          review.title || null,
          review.content || null,
          review.review_date || new Date()
        ]
      );
      insertedIds.push(result.insertId);
    }

    await connection.commit();

    // Fetch all inserted reviews
    if (insertedIds.length > 0) {
      const placeholders = insertedIds.map(() => '?').join(',');
      const [rows] = await pool.execute(
        `SELECT * FROM reviews WHERE id IN (${placeholders})`,
        insertedIds
      );

      res.status(201).json({
        success: true,
        message: `Created ${insertedIds.length} reviews`,
        data: rows
      });
    } else {
      // Empty batch: nothing was inserted, but the client still needs a response
      res.status(400).json({
        success: false,
        error: 'No reviews provided'
      });
    }
  } catch (error) {
    await connection.rollback();

    if (error.code === 'ER_DUP_ENTRY') {
      return res.status(409).json({
        success: false,
        error: 'Duplicate review in batch',
        code: 'BATCH_DUPLICATE'
      });
    }

    console.error('Batch insert error:', error);
    res.status(500).json({
      success: false,
      error: 'Failed to create reviews batch'
    });
  } finally {
    connection.release();
  }
});
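Note how the batch fetch interpolates only `?` placeholders into the SQL string, never the values themselves — the values still travel as parameters. That pattern can be isolated into a pure helper (hypothetical name, assuming mysql2-style `?` placeholders):

```javascript
// Hypothetical helper: build an `IN (?, ?, ...)` clause safely. Only the
// placeholder list is interpolated into the SQL; the values are returned
// separately so they are always sent as bound parameters.
function inClause(column, values) {
  if (!Array.isArray(values) || values.length === 0) {
    // An empty IN () is a MySQL syntax error, so fail early
    throw new Error('inClause requires a non-empty array');
  }
  const placeholders = values.map(() => '?').join(', ');
  return { sql: `${column} IN (${placeholders})`, params: values };
}

// Usage sketch:
// const { sql, params } = inClause('id', insertedIds);
// const [rows] = await pool.execute(`SELECT * FROM reviews WHERE ${sql}`, params);
```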
Part 4: Testing & Health Checks (10 min)
Exercise 4.1: Test Complete CRUD Operations
Test creating reviews:
# Create a review
curl -X POST http://localhost:4000/api/reviews \
-H "Content-Type: application/json" \
-d '{
"product_id": 1,
"source": "Amazon",
"external_review_id": "AMZ001",
"author": "John Doe",
"rating": 5,
"title": "Excellent headphones!",
"content": "Great sound quality and battery life.",
"review_date": "2025-09-15T10:30:00Z"
}'
# Test batch insert
curl -X POST http://localhost:4000/api/reviews/batch \
-H "Content-Type: application/json" \
-d '{
"reviews": [
{
"product_id": 1,
"source": "BestBuy",
"external_review_id": "BB001",
"author": "Jane Smith",
"rating": 4,
"title": "Good value",
"content": "Nice quality for the price."
},
{
"product_id": 1,
"source": "Walmart",
"external_review_id": "WM001",
"author": "Mike Wilson",
"rating": 5,
"title": "Love them!",
"content": "Best headphones I have owned."
}
]
}'
Exercise 4.2: Add Database Health Check
Create src/routes/health.js:
const express = require('express');
const { pool } = require('../config/database');
const router = express.Router();
router.get('/database', async (req, res) => {
  try {
    const start = Date.now();
    const [rows] = await pool.execute('SELECT COUNT(*) as review_count FROM reviews');
    const queryTime = Date.now() - start;

    res.json({
      status: 'healthy',
      database: 'connected',
      response_time_ms: queryTime,
      total_reviews: rows[0].review_count,
      pool_status: {
        connection_limit: pool.pool.config.connectionLimit,
        // Note: _allConnections/_freeConnections are mysql2 internals and
        // may change between versions; treat these as debug-only values
        // (see the instrumentation note in src/config/database.js).
        total_connections: pool.pool._allConnections.length,
        free_connections: pool.pool._freeConnections.length,
        used_connections: pool.pool._allConnections.length - pool.pool._freeConnections.length
      },
      timestamp: new Date().toISOString()
    });
  } catch (error) {
    res.status(503).json({
      status: 'unhealthy',
      database: 'disconnected',
      error: error.message,
      timestamp: new Date().toISOString()
    });
  }
});

module.exports = router;
Exercise 4.3: Final Verification
Test all endpoints work:
# Health check
curl http://localhost:4000/health/database
# Get reviews
curl "http://localhost:4000/api/reviews?product_id=1"
# Get aggregate stats
curl "http://localhost:4000/api/reviews/aggregate?product_id=1"
Part 5: Optional Extensions (5 min)
Extension 1: Query Performance Monitoring
// Add query timing to all database operations
const originalExecute = pool.execute.bind(pool);
pool.execute = async function (sql, params) {
  const start = Date.now();
  try {
    const result = await originalExecute(sql, params);
    const duration = Date.now() - start;
    if (duration > 1000) {
      console.warn(`🐌 Slow query (${duration}ms):`, sql.substring(0, 50) + '...');
    }
    return result;
  } catch (error) {
    console.error(`❌ Query failed (${Date.now() - start}ms):`, error.message);
    throw error;
  }
};
Extension 2: Migration System
// scripts/migrate.js
const fs = require('fs');
const path = require('path');
const { pool } = require('../src/config/database');
async function runMigrations() {
  // Create migrations table
  await pool.execute(`
    CREATE TABLE IF NOT EXISTS migrations (
      id INT AUTO_INCREMENT PRIMARY KEY,
      filename VARCHAR(255) NOT NULL UNIQUE,
      executed_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
  `);

  // Run pending migrations
  const migrationsDir = path.join(__dirname, '../migrations');
  const files = fs.readdirSync(migrationsDir).filter(f => f.endsWith('.sql')).sort();

  for (const file of files) {
    const [existing] = await pool.execute('SELECT filename FROM migrations WHERE filename = ?', [file]);
    if (existing.length > 0) continue;

    console.log(`Running migration: ${file}`);
    const sql = fs.readFileSync(path.join(migrationsDir, file), 'utf8');
    // Migration files may contain multiple statements; use query(), since
    // execute() (prepared statements) only accepts a single statement
    await pool.query(sql);
    await pool.execute('INSERT INTO migrations (filename) VALUES (?)', [file]);
    console.log(`✅ Completed: ${file}`);
  }

  console.log('🎉 All migrations completed!');
  process.exit(0);
}

runMigrations().catch(error => {
  console.error('❌ Migration failed:', error);
  process.exit(1);
});
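The skip-already-applied step is the part of the migration runner most worth unit-testing. Extracted as a pure function (hypothetical name, mirroring the filter-and-sort logic of the loop above):

```javascript
// Hypothetical helper: given directory listing and the filenames already
// recorded in the migrations table, return the migrations still to run,
// in sorted (chronological-by-prefix) order.
function pendingMigrations(allFiles, appliedFiles) {
  const applied = new Set(appliedFiles);
  return allFiles
    .filter(f => f.endsWith('.sql') && !applied.has(f))
    .sort();
}
```

The runner could then do `const pending = pendingMigrations(fs.readdirSync(migrationsDir), appliedRows.map(r => r.filename));` after a single SELECT, instead of querying per file.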
Success Criteria & Assessment
Core Requirements ✅
Testing Checklist ✅
Troubleshooting
Common Issues
Problem: ER_ACCESS_DENIED_ERROR
# Check credentials in .env
DB_USER=root
DB_PASSWORD=password
DB_HOST=127.0.0.1
Problem: ER_NO_SUCH_TABLE
# Run schema migration
mysql -h 127.0.0.1 -P 3306 -u root -ppassword review_aggregator < schema.sql
Problem: ECONNREFUSED
# Check MySQL container
docker ps | grep mysql
# Restart if needed
docker-compose up -d mysql
Resources
🎉 Practice Complete!
Congratulations! You’ve successfully:
- ✅ Integrated MySQL with Express using connection pools
- ✅ Replaced in-memory arrays with persistent database storage
- ✅ Implemented parameterized queries for SQL injection prevention
- ✅ Added transaction support for data consistency
- ✅ Created aggregate statistics calculations
- ✅ Built proper error handling for database operations
You’re ready for full-stack integration! 🚀