Practice 5 — Database Integration and full-stack wiring
Express + MySQL: hands-on practice.
A practice for integrating an Express API with MySQL and a React frontend. Students should already know basic SQL and MySQL. The session focuses on practical patterns: connection pools, parameterized queries, transactions, and error handling.
Practice Overview
This practice is two-fold:
- a minimal Express + MySQL integration exercise, showing how to set up a connection pool, use parameterized queries to avoid SQL injection, and implement basic Create/Read/Update/Delete operations with error handling in an Express API.
- a full-stack wiring exercise to connect a React frontend to the Express + MySQL backend.
For this session we use a smaller, self-contained example: a Customer Management mini-API and UI. Students will work with a simple relation (customers and orders), practice parameterized queries, and use transactions.
Backend: Express + MySQL integration
Setup
First we create a minimal Express project for the exercises and apply the SQL schema to your local MySQL server.
- Create the project and install dependencies
# create project folder customer-workshop, enter it, init the project
mkdir customer-workshop && cd customer-workshop
npm init -y
# install dependencies: express (web server), mysql2 (MySQL driver), dotenv (env vars), morgan (logging), cors (CORS)
npm install express mysql2 dotenv morgan cors
# (optional) dev helper to auto-restart server on code changes
npm install --save-dev nodemon
# create a minimal source layout: src, src/routes, src/config
mkdir -p src/routes src/config
- Create a .env file in the project root. It will hold your database connection info. Adapt the values appropriately, especially DB_PASSWORD and DB_NAME:
DB_HOST=127.0.0.1
DB_PORT=3306
DB_USER=root
DB_PASSWORD=your_password
DB_NAME=workshop_db
PORT=4000
- Apply the SQL schema
Create the file database-schema.sql with the following content; it creates the two tables used in the exercises.
CREATE DATABASE IF NOT EXISTS workshop_db;
USE workshop_db;
CREATE TABLE customers (
id INT AUTO_INCREMENT PRIMARY KEY,
name VARCHAR(255) NOT NULL,
email VARCHAR(255) NOT NULL UNIQUE,
phone VARCHAR(50),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE orders (
id INT AUTO_INCREMENT PRIMARY KEY,
customer_id INT NOT NULL,
total DECIMAL(10,2) NOT NULL,
order_date DATE,
status VARCHAR(50) DEFAULT 'pending',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (customer_id) REFERENCES customers(id) ON DELETE CASCADE,
INDEX idx_customer_id (customer_id)
);
(OPTIONAL) If you have Docker running, you can spin up a MySQL container for testing and apply the schema automatically with the following command (first stop any MySQL server or other Docker container already using port 3306, for example from the Docker Desktop UI):
docker run -v $(pwd)/database-schema.sql:/docker-entrypoint-initdb.d/database-schema.sql:ro -e MYSQL_ROOT_PASSWORD=your_password -p 3306:3306 -d mysql:8
If you are not using the Docker option, execute the schema against your MySQL server directly (adapt user/password/host/port as needed).
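For example, with the mysql command-line client (assuming it is installed and the credentials match your .env):
# load database-schema.sql into the local MySQL server (you will be prompted for the password)
mysql -h 127.0.0.1 -P 3306 -u root -p < database-schema.sql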
Connection pool
The first step is to set up a connection pool to manage database connections efficiently. A connection pool allows us to reuse existing connections, reducing the overhead of establishing new connections for each request. This is easy with the mysql2/promise library.
Create src/config/database.js with the following content:
// src/config/database.js
const mysql = require('mysql2/promise');
require('dotenv').config();
const pool = mysql.createPool({
host: process.env.DB_HOST || '127.0.0.1',
port: process.env.DB_PORT ? parseInt(process.env.DB_PORT,10) : 3306,
user: process.env.DB_USER || 'root',
password: process.env.DB_PASSWORD || '',
database: process.env.DB_NAME || 'workshop_db'
});
async function testConnection() {
try {
await pool.execute('SELECT 1');
console.log('✅ Database connected successfully');
return true;
} catch (err) {
console.error('❌ Database connection failed:', err.message);
return false;
}
}
module.exports = { pool, testConnection };
This file creates a connection pool and a testConnection function that runs a simple query on startup to verify the connection. It exports the pool object and the testConnection function for use in other parts of the application.
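As a quick illustration of how other modules will use the exported pool (this is only a sketch; the route file later in this practice follows the same pattern):
// example: run a one-off query through the shared pool (e.g. from a file under src/)
const { pool } = require('./config/database');
async function countCustomers() {
  const [rows] = await pool.execute('SELECT COUNT(*) AS total FROM customers');
  console.log('customers in database:', rows[0].total);
}
countCustomers().catch(console.error);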
Starting the server with DB connection test
Now we create src/app.js to set up the Express app with middleware and routes:
Later, when you copy the provided src/routes/customers.js into your project, remember to uncomment the require and router-mounting lines so the endpoints become active.
// src/app.js
const express = require('express');
const morgan = require('morgan');
const cors = require('cors');
require('dotenv').config();
// IMPORTANT:
// UNCOMMENT THE NEXT LINE WHEN YOU HAVE IMPLEMENTED THE ROUTES
// const customersRouter = require('./routes/customers');
const app = express();
app.use(morgan('dev'));
app.use(express.json());
app.use(cors({ origin: process.env.FRONTEND_ORIGIN || 'http://localhost:5173', credentials: true }));
app.get('/health', (req, res) => res.json({ status: 'ok' }));
// IMPORTANT:
// UNCOMMENT THE NEXT LINE WHEN YOU HAVE IMPLEMENTED THE ROUTES
//app.use('/api/customers', customersRouter);
// 404
app.use((req, res) => res.status(404).json({ success: false, error: 'Not Found' }));
// error handler
app.use((err, req, res, next) => {
console.error(err);
if (res.headersSent) return next(err);
res.status(err.status || 500).json({ success: false, error: err.message || 'Internal Server Error' });
});
module.exports = app;
And finally, create src/index.js to start the server after testing the database connection:
// src/index.js
const app = require('./app');
const { testConnection } = require('./config/database');
const PORT = process.env.PORT ? parseInt(process.env.PORT, 10) : 4000;
async function start() {
try {
const ok = await testConnection();
if (!ok) {
console.error('Database connection failed - exiting');
process.exit(1);
}
const server = app.listen(PORT, () => {
console.log(`Server listening: http://localhost:${PORT}`);
});
const shutdown = (signal) => {
console.log(`${signal} received - closing server`);
server.close(() => process.exit(0));
setTimeout(() => process.exit(1), 10000);
};
process.on('SIGINT', () => shutdown('SIGINT'));
process.on('SIGTERM', () => shutdown('SIGTERM'));
} catch (err) {
console.error('Startup error:', err);
process.exit(1);
}
}
start();
Run the server with node src/index.js, or npx nodemon src/index.js if you installed nodemon.
You should see a successful database connection message if everything is set up correctly.
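If everything is wired up correctly, the startup output should look roughly like this (the messages come from testConnection and the listen callback):
✅ Database connected successfully
Server listening: http://localhost:4000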
The API querying the database
The route implementation below is a complete, copyable CRUD (Create, Read, Update, Delete) reference for src/routes/customers.js; use it as the reference implementation for the exercise.
// src/routes/customers.js
const express = require('express');
const mysql = require('mysql2');
// Import the shared connection pool. Using a pool is best practice for web apps:
// it maintains multiple reusable connections to the database which improves
// performance and avoids the overhead of opening/closing a connection per request.
const { pool } = require('../config/database');
const router = express.Router();
// Small helper: parse an integer with a default value when parsing fails.
// We use this to validate numeric query params like `limit` and `offset`.
function toInt(v, def) {
const n = Number.parseInt(v, 10);
return Number.isNaN(n) ? def : n;
}
// GET /api/customers?q=&limit=&offset=
// List customers with optional search and pagination.
// Teaching notes:
// - `req.query` contains parsed query string parameters.
// - We validate and cap `limit` to avoid expensive queries returning huge results.
// - We use prepared statements (parameterized queries) to prevent SQL injection.
// - We use the shared `pool` to run queries via `pool.query` or `pool.execute`.
router.get('/', async (req, res, next) => {
try {
const q = req.query.q ? String(req.query.q).trim() : '';
const limit = Math.min(toInt(req.query.limit, 20), 100);
const offset = Math.max(toInt(req.query.offset, 0), 0);
// Build a parameterized query. LIMIT and OFFSET are validated as numbers
// above and passed through mysql.format below, because some driver methods
// don't allow placeholders in LIMIT/OFFSET.
let sql = 'SELECT id, name, email, phone, created_at FROM customers';
const params = [];
if (q) {
// Use LIKE with wildcards for simple search functionality. The `?`
// placeholders are safe — the driver will escape values appropriately.
sql += ' WHERE name LIKE ? OR email LIKE ?';
params.push(`%${q}%`, `%${q}%`);
}
// Append ORDER/LIMIT/OFFSET. We validated `limit` and `offset` above.
sql += ' ORDER BY created_at DESC LIMIT ? OFFSET ?';
// mysql.format creates a single SQL string with proper escaping for
// placeholders. We use it because some pool methods require a complete
// query string instead of a query+params pair.
const formattedSql = mysql.format(sql, [...params, limit, offset]);
const [rows] = await pool.query(formattedSql);
// total count for pagination: run a separate COUNT(*) query using the
// same WHERE clause so the client knows how many total rows exist.
let countSql = 'SELECT COUNT(*) as total FROM customers';
const countParams = [];
if (q) {
countSql += ' WHERE name LIKE ? OR email LIKE ?';
countParams.push(`%${q}%`, `%${q}%`);
}
// We can use `execute` here because we provide the SQL and parameters
// separately — the driver will prepare/execute safely.
const [countRows] = await pool.execute(countSql, countParams);
res.json({ success: true, data: rows, meta: { total: countRows[0].total, limit, offset } });
} catch (err) {
next(err);
}
});
// GET /api/customers/:id
router.get('/:id', async (req, res, next) => {
try {
const id = toInt(req.params.id, null);
if (!id) return res.status(400).json({ success: false, error: 'invalid id' });
// Use a parameterized query when inserting variables into SQL.
// `pool.execute` is convenient: it accepts SQL with `?` placeholders
// and an array of values which are safely escaped by the driver.
const [rows] = await pool.execute('SELECT id, name, email, phone, created_at FROM customers WHERE id = ?', [id]);
if (!rows.length) return res.status(404).json({ success: false, error: 'Customer not found' });
res.json({ success: true, data: rows[0] });
} catch (err) {
next(err);
}
});
// POST /api/customers
// Create a new customer. Important teaching points here:
// - Validate required fields coming from `req.body`.
// - Return appropriate HTTP status codes (201 on created, 400 on bad request,
// 409 on duplicate email, etc.).
// - Use try/catch and pass unexpected errors to `next(err)` so Express can
// handle them in a centralized error handler.
router.post('/', async (req, res, next) => {
try {
const { name, email, phone } = req.body || {};
if (!name || !email) return res.status(400).json({ success: false, error: 'name and email required' });
// INSERT using placeholders. If the `email` column has a UNIQUE constraint
// in the DB, a duplicate will raise an error which we catch below.
const [result] = await pool.execute(
'INSERT INTO customers (name, email, phone) VALUES (?, ?, ?)',
[name, email, phone || null]
);
// Fetch the inserted row to return to the client. This gives the client
// the authoritative created_at timestamp and generated id.
const [rows] = await pool.execute('SELECT id, name, email, phone, created_at FROM customers WHERE id = ?', [result.insertId]);
res.status(201).json({ success: true, data: rows[0] });
} catch (error) {
if (error && error.code === 'ER_DUP_ENTRY') {
return res.status(409).json({ success: false, error: 'Customer with that email already exists' });
}
next(error);
}
});
// PUT /api/customers/:id
// Update a customer. We dynamically build the SET clause so that partial
// updates are possible (PATCH-like behavior). Always validate inputs and
// protect against SQL injection by using placeholders for values.
router.put('/:id', async (req, res, next) => {
try {
const id = toInt(req.params.id, null);
if (!id) return res.status(400).json({ success: false, error: 'invalid id' });
const { name, email, phone } = req.body || {};
if (!name && !email && phone === undefined) return res.status(400).json({ success: false, error: 'nothing to update' });
const fields = [];
const params = [];
if (name) { fields.push('name = ?'); params.push(name); }
if (email) { fields.push('email = ?'); params.push(email); }
if (phone !== undefined) { fields.push('phone = ?'); params.push(phone || null); }
params.push(id);
const sql = `UPDATE customers SET ${fields.join(', ')} WHERE id = ?`;
// Execute the UPDATE. The driver will escape the params in the `params`
// array so even though the SQL string contains a dynamic fields list, the
// actual values are still passed safely via placeholders.
const [result] = await pool.execute(sql, params);
if (result.affectedRows === 0) return res.status(404).json({ success: false, error: 'Customer not found' });
const [rows] = await pool.execute('SELECT id, name, email, phone, created_at FROM customers WHERE id = ?', [id]);
res.json({ success: true, data: rows[0] });
} catch (error) {
if (error && error.code === 'ER_DUP_ENTRY') {
return res.status(409).json({ success: false, error: 'Customer with that email already exists' });
}
next(error);
}
});
// DELETE /api/customers/:id
// Delete a customer by id. Deletions should confirm the resource existed
// by checking affectedRows and return 404 when not found.
router.delete('/:id', async (req, res, next) => {
try {
const id = toInt(req.params.id, null);
if (!id) return res.status(400).json({ success: false, error: 'invalid id' });
const [result] = await pool.execute('DELETE FROM customers WHERE id = ?', [id]);
if (result.affectedRows === 0) return res.status(404).json({ success: false, error: 'Customer not found' });
res.json({ success: true, message: 'Deleted' });
} catch (err) {
next(err);
}
});
// Example transaction endpoint (optional)
// Transactions are used when multiple related statements must all succeed or
// all fail together (atomicity). Here we create a customer and multiple
// orders in the same transaction so the DB is never left in a partial state.
router.post('/batch-orders', async (req, res, next) => {
// Acquire a dedicated connection from the pool so we can control the
// transaction boundaries (BEGIN/COMMIT/ROLLBACK) on that connection.
const conn = await pool.getConnection();
try {
const { customer, orders } = req.body || {};
if (!customer || !orders || !Array.isArray(orders)) {
conn.release();
return res.status(400).json({ success: false, error: 'customer and orders[] required' });
}
await conn.beginTransaction();
const [insCustomer] = await conn.execute('INSERT INTO customers (name, email, phone) VALUES (?, ?, ?)', [customer.name, customer.email, customer.phone || null]);
const customerId = insCustomer.insertId;
// Insert many orders. We use the same connection so they're part of the
// transaction. If any insert fails, we roll back the whole transaction.
const orderPromises = orders.map(o => conn.execute('INSERT INTO orders (customer_id, total, order_date, status) VALUES (?, ?, ?, ?)', [customerId, o.total, o.order_date || null, o.status || 'pending']));
await Promise.all(orderPromises);
await conn.commit();
conn.release();
res.status(201).json({ success: true, message: 'Customer and orders created' });
} catch (err) {
// On error, rollback to return the DB to the previous consistent state.
await conn.rollback().catch(() => {});
conn.release();
next(err);
}
});
module.exports = router;
Now that you understand how to create route handlers with database access, you can use AI tools to generate similar code for the other entities in your review aggregator project. You can also look at the minimal full-stack application to study a more complete example.
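For example, a read-only route for the orders table could follow the same pattern. This is only a sketch (it is not part of the provided code, and you would still need to mount it in src/app.js with app.use('/api/orders', ordersRouter)):
// src/routes/orders.js (sketch)
const express = require('express');
const { pool } = require('../config/database');
const router = express.Router();
// GET /api/orders/customer/:customerId
// List the orders of one customer, newest first.
router.get('/customer/:customerId', async (req, res, next) => {
  try {
    const customerId = Number.parseInt(req.params.customerId, 10);
    if (Number.isNaN(customerId)) {
      return res.status(400).json({ success: false, error: 'invalid customer id' });
    }
    // Parameterized query: the ? placeholder is safely escaped by the driver.
    const [rows] = await pool.execute(
      'SELECT id, customer_id, total, order_date, status, created_at FROM orders WHERE customer_id = ? ORDER BY created_at DESC',
      [customerId]
    );
    res.json({ success: true, data: rows });
  } catch (err) {
    next(err);
  }
});
module.exports = router;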
Quick API tests — curl and VS Code REST Client
To test the backend we use the VS Code REST Client extension. Create a file api-tests.http inside the project root (or in the frontend folder for frontend-to-backend calls). Paste the following to get clickable requests inside VS Code:
These examples assume your backend is running on http://localhost:4000 and the API base path is /api.
@host = http://localhost:4000
### Health
GET {{host}}/health
### List customers (default)
GET {{host}}/api/customers
### Search customers (pagination)
GET {{host}}/api/customers?q=alice&limit=5&offset=0
### Create customer Alice
POST {{host}}/api/customers
Content-Type: application/json
{
"name": "Alice Example",
"email": "alice@example.test",
"phone": "+1-555-0100"
}
### Create customer Bob
POST {{host}}/api/customers
Content-Type: application/json
{
"name": "Bob Example",
"email": "bob@example.test",
"phone": "+1-555-0101"
}
### Get customer by id
GET {{host}}/api/customers/1
### Update customer
PUT {{host}}/api/customers/1
Content-Type: application/json
{
"name": "Alice Updated",
"phone": "+1-555-9999"
}
### Delete customer
DELETE {{host}}/api/customers/1
### Batch orders (transaction)
POST {{host}}/api/customers/batch-orders
Content-Type: application/json
{
"customer": { "name": "Batch Customer", "email": "batch@example.test", "phone": null },
"orders": [
{ "total": 12.50, "order_date": "2025-10-23", "status": "paid" },
{ "total": 7.30 }
]
}
Click the “Send Request” link that appears above each request to run it from VS Code.
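If you prefer the terminal, the same requests can be sent with curl (assuming the backend runs on http://localhost:4000):
# list customers
curl http://localhost:4000/api/customers
# create a customer
curl -X POST http://localhost:4000/api/customers \
  -H "Content-Type: application/json" \
  -d '{"name":"Alice Example","email":"alice@example.test","phone":"+1-555-0100"}'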
Quick Full-stack integration
To illustrate the full-stack integration, we create a minimal React frontend for customers.
In another terminal, scaffold a React project:
# scaffold a React app with Vite
npm create vite@latest customer-frontend -- --template react
If you started the frontend dev server, stop it with Ctrl-C before continuing.
Now add the axios dependency, used to call the backend API.
cd customer-frontend
npm install axios
Then create a configuration file for the frontend, .env.local:
# .env.local
VITE_BACKEND_URL=http://localhost:4000
The sample application is minimal: a single component, src/App.jsx.
To understand how the frontend calls the backend API and displays the customers list, study the fetchCustomers function closely — it shows how to request paginated/search results from the server, how to extract and store the returned data, and how to handle errors and loading states. For form behaviour, examine handleSubmit: it demonstrates validating inputs, posting JSON (JavaScript Object Notation) to the API, clearing the form on success, and triggering a UI refresh afterwards.
When reviewing the component, pay particular attention to the use of React hooks:
- useState is used to keep form fields and the customers array in sync with the UI. Notice how controlled inputs make it easy to read values when submitting and to reset fields after success.
- useEffect is used to fetch data when the component mounts (and optionally when pagination/search parameters change). This illustrates the component lifecycle: side effects run at the right times so the UI reflects the server state.
// src/App.jsx
// This file demonstrates how a React frontend can interact with a backend API.
// We'll use axios for HTTP requests and React hooks for state management.
// Key concepts: API calls, async/await, error handling, environment variables.
import React, { useState, useEffect } from 'react';
import axios from 'axios';
// Environment variable for backend URL - allows different URLs for dev/prod
// Falls back to empty string if not set (for local development)
const BACKEND = import.meta.env.VITE_BACKEND_URL || '';
console.log('Loaded BACKEND URL:', BACKEND); // Debug: Check if .env.local is loaded
function App() {
// State for storing the list of customers fetched from backend
const [customers, setCustomers] = useState([]);
// State for form inputs - controlled components in React
const [name, setName] = useState('');
const [email, setEmail] = useState('');
const [phone, setPhone] = useState('');
// useEffect runs once when component mounts (empty dependency array [])
// This fetches initial data from the backend
useEffect(() => {
fetchCustomers();
}, []);
// Function to fetch customers from backend API
// Uses async/await for asynchronous operations
const fetchCustomers = async () => {
try {
// GET request to backend endpoint
// BACKEND is the base URL, /api/customers is the endpoint
const res = await axios.get(`${BACKEND}/api/customers`);
// The backend returns { success, data: [...], meta }, so the customer
// array is in res.data.data; default to an empty array if it is missing
setCustomers(res.data.data || []);
} catch (err) {
// Error handling: log error and set empty customers list
console.error('Failed to fetch customers:', err);
setCustomers([]);
}
};
// Function to handle form submission (adding new customer)
const handleSubmit = async (e) => {
e.preventDefault(); // Prevent default form submission behavior
try {
// POST request to create new customer
// Sends form data as JSON in request body
await axios.post(`${BACKEND}/api/customers`, {
name,
email,
phone
});
// Clear form inputs after successful submission
setName('');
setEmail('');
setPhone('');
// Refetch customers to update the list with new customer
fetchCustomers();
} catch (err) {
// Error handling: log detailed error information
// err.response?.data contains backend error response
// err.message contains general error message
console.error('Failed to create customer:', err?.response?.data || err.message);
}
};
return (
<div style={{ padding: 20, fontFamily: 'system-ui, sans-serif' }}>
<h1>Customer Management</h1>
{/* Form for adding new customers */}
{/* onSubmit calls handleSubmit when form is submitted */}
<form onSubmit={handleSubmit} style={{ marginBottom: 20 }}>
{/* Controlled inputs: value and onChange keep state in sync */}
<input
value={name}
onChange={e => setName(e.target.value)}
placeholder="Name"
required // HTML5 validation
/>
<input
value={email}
onChange={e => setEmail(e.target.value)}
placeholder="Email"
type="email" // HTML5 email validation
required
/>
<input
value={phone}
onChange={e => setPhone(e.target.value)}
placeholder="Phone"
// Phone is optional (no required attribute)
/>
<button type="submit">Add Customer</button>
</form>
{/* Display list of customers */}
{/* map() creates list items for each customer */}
<ul>
{customers.map(c => (
<li key={c.id}>
{/* Display customer name and email */}
{c.name} — {c.email}
</li>
))}
</ul>
</div>
);
}
export default App;
Now run the frontend development server:
npm run dev
When the database, the backend, and the frontend are all running, you should be able to use the full-stack application (the Vite dev server runs on http://localhost:5173 by default, matching the CORS origin configured in the backend).
Back to your review aggregator project
This minimal full-stack example can be extended to cover more complex scenarios in your review aggregator project.
You can extend your backend with similar patterns for other entities like reviews and products: implement CRUD routes for these entities following the same structure as the customers routes, or improve them with AI tools.
You can extend your frontend to call those new endpoints and manage state for products and reviews, which are persisted in your MySQL database.
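As an illustration, fetching products from a hypothetical /api/products endpoint would follow the same shape as fetchCustomers in App.jsx (the endpoint name and the setProducts state setter are placeholders for your own project):
// sketch: fetch products, assuming the backend keeps the { success, data } response shape
const fetchProducts = async () => {
  try {
    const res = await axios.get(`${BACKEND}/api/products`);
    setProducts(res.data.data || []);
  } catch (err) {
    console.error('Failed to fetch products:', err);
    setProducts([]);
  }
};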