Docker Architecture

UK News Scraper + N8N · docker-compose.yml · Internal Reference

🖥️ Host Machine — Ubuntu Linux (sufideen)
🐳 Docker Network (bridge — default)
⚙️ n8n — docker.n8n.io/n8nio/n8n — host:5678 → container:5678
  • restart: unless-stopped
  • timezone: Europe/London
  • volume: n8n_data → /home/node/.n8n
  • depends_on: scraper (healthy)
  • workflow: UK News Digest (active)
  • schedule: 07:00 & 18:00 daily
  • calls: http://scraper:8765/run
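The n8n service above might be expressed in docker-compose.yml roughly as follows. This is a sketch reconstructed from the details listed (image, ports, volume, env vars, health-gated dependency); key names in the actual file may differ:

```yaml
services:
  n8n:
    image: docker.n8n.io/n8nio/n8n
    restart: unless-stopped
    ports:
      - "5678:5678"          # host:container
    environment:
      - GENERIC_TIMEZONE=Europe/London
      - TZ=Europe/London
      - N8N_RUNNERS_ENABLED=true
      - N8N_ENFORCE_SETTINGS=true
    volumes:
      - n8n_data:/home/node/.n8n   # workflows, credentials, executions
    depends_on:
      scraper:
        condition: service_healthy  # wait for the scraper healthcheck
```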

⬆ Waits for scraper healthcheck before starting

HTTP GET /run → JSON response
🐍 scraper — uk-news-scraper (Dockerfile.scraper) — host:8765 → container:8765
  • base image: python:3.12-slim
  • restart: unless-stopped
  • entrypoint: scraper_server.py
  • GET /health → {"status":"ok"}
  • GET /run → JSON {subject, htmlBody}
  • volume: ./output → /app/output
  • scrapes: BBC · Guardian · Independent · Sky
✅ HEALTHY
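For n8n's `depends_on: condition: service_healthy` gate to work, the scraper service needs a healthcheck in the compose file. A sketch along these lines would fit the details above; the probe command and timings are assumptions (python:3.12-slim ships no curl or wget, so the check uses the Python stdlib against the documented /health endpoint):

```yaml
services:
  scraper:
    build:
      context: .
      dockerfile: Dockerfile.scraper
    restart: unless-stopped
    ports:
      - "8765:8765"
    volumes:
      - ./output:/app/output    # digest files land on the host
    healthcheck:
      # slim image has no curl, so probe /health with Python itself
      test: ["CMD", "python", "-c", "import urllib.request; urllib.request.urlopen('http://localhost:8765/health')"]
      interval: 30s
      timeout: 5s
      retries: 3
```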
📦 Named Volume — n8n_data (external)
n8n_data → /home/node/.n8n
Stores: workflows, credentials, execution history, settings

📁 Bind Mount — output/
/home/sufideen/Documents/uk-news-scraper/output → /app/output
Stores: HTML digest files
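Because n8n_data is marked external, the compose file also needs a top-level volumes declaration; something like this sketch (the volume must already exist outside compose):

```yaml
volumes:
  n8n_data:
    external: true   # pre-created, e.g. with: docker volume create n8n_data
```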
🔑 Env Variables — N8N
GENERIC_TIMEZONE=Europe/London
TZ=Europe/London
N8N_RUNNERS_ENABLED=true
N8N_ENFORCE_SETTINGS=true
Other services running on host
  • Portainer — :8000 · :9443 — Docker GUI management
  • Open WebUI — :3000 — LLM chat interface
  • Host Browser — → localhost:5678 — N8N UI access
  • Gmail (cloud) — SMTP / OAuth2 — receives digest email
End-to-End Data Flow
① Trigger — Cron fires at 07:00 / 18:00 (Europe/London)
② N8N — HTTP Request: GET /run on :8765
③ Scraper — Fetch articles from BBC · Guardian · Independent · Sky
④ Build — HTML digest, top 10 per source
⑤ Save — Write to disk: output/digests/
⑥ JSON — Return to N8N: {subject, htmlBody}
⑦ Gmail — Send email via OAuth2, HTML body
⑧ Inbox — Digest received (mawarajudeen10@)