Key Skills
Systems & Architecture
- Designed and deployed a multi-service platform using Proxmox VE and LXC containers
- Service separation: abstraction layer, backend, frontend, cache, internal DB, reverse proxy
- Container boot order planning and service dependency handling
- Internal HTTP routing and reverse proxy setup for clean access patterns
- Basic operational setup: service startup scripts, clean restarts, and isolation between components
Backend Development & APIs
- Built REST endpoints using FastAPI for query routing and analysis execution
- Structured request/response handling for passing config, ranges, and analysis settings
- Input validation and defensive error handling so malformed requests fail with clear errors instead of breaking downstream services
- Logging of key actions (requests, failures, cache events) for traceability
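The validation-and-logging pattern above can be sketched framework-agnostically (the FastAPI routing is omitted here, and `AnalysisRequest` with its field names is a hypothetical payload, not the project's actual schema):

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("api")

@dataclass
class AnalysisRequest:
    """Hypothetical payload: which table to query and over what range."""
    table: str
    start: str   # ISO-8601 timestamp
    end: str     # ISO-8601 timestamp
    method: str  # e.g. "dtw"

ALLOWED_METHODS = {"dtw", "correlation"}

def validate(req: AnalysisRequest) -> list[str]:
    """Return a list of human-readable problems; empty means valid."""
    errors = []
    if not req.table:
        errors.append("table must not be empty")
    if req.method not in ALLOWED_METHODS:
        errors.append(f"unknown method {req.method!r}")
    if req.start >= req.end:  # ISO-8601 strings compare chronologically
        errors.append("start must be before end")
    return errors

def handle(req: AnalysisRequest) -> dict:
    """Reject bad requests with a structured error instead of crashing."""
    problems = validate(req)
    if problems:
        log.warning("rejected request: %s", problems)
        return {"status": "error", "errors": problems}
    log.info("accepted request for table %s", req.table)
    return {"status": "ok"}
```

In a FastAPI service the same checks would typically live in a Pydantic model so rejection happens before the handler runs; the point here is only the fail-with-a-structured-error-and-log pattern.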
Data Engineering & Database Integration
- Dynamic database access via a custom abstraction layer (ODBC + REST patterns)
- Worked with SQL Server and PostgreSQL connection handling and query execution
- Designed a consistent data flow to return query results as pandas DataFrames / JSON payloads
- Internal PostgreSQL used for configuration storage and audit-style logging
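A minimal sketch of the query-execution side of that abstraction layer, using stdlib `sqlite3` as a stand-in for the SQL Server / PostgreSQL drivers. Rows come back as list-of-dicts, a shape that converts directly into a pandas DataFrame or a JSON payload; table and column names are illustrative:

```python
import sqlite3

def run_query(conn, sql, params=()):
    """Execute a parameterised query and return rows as list-of-dicts.

    The same row shape feeds pandas.DataFrame(rows) or a JSON response,
    so downstream code stays independent of the driver in use.
    """
    cur = conn.execute(sql, params)
    cols = [d[0] for d in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

# Demo with an in-memory database standing in for an external DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 [("2024-01-01", 1.5), ("2024-01-02", 2.5)])
rows = run_query(conn, "SELECT ts, value FROM readings WHERE value > ?", (1.0,))
```

With pyodbc or psycopg2 the connection object changes but `run_query` does not, which is the point of keeping the access layer thin and driver-agnostic.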
Caching & Performance
- Implemented Redis caching for raw query reuse and repeat analysis speed-up
- Key-based retrieval for loading historical queries and processed outputs
- TTL handling and cache inspection for debugging and memory awareness
- Reduced repeated load on external databases by avoiding duplicate queries
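The cache-aside pattern described above can be sketched as follows. The real service would use redis-py (`setex`/`get`); a dict-backed stub is used here so the sketch is self-contained, and the key scheme is an assumption for illustration:

```python
import hashlib
import json
import time

class TTLCache:
    """Dict-backed stand-in for Redis: setex-style storage with expiry."""
    def __init__(self):
        self._store = {}

    def setex(self, key, ttl_seconds, value):
        self._store[key] = (time.time() + ttl_seconds, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expiry, value = entry
        if time.time() > expiry:  # lazy eviction on read
            del self._store[key]
            return None
        return value

def cached_query(cache, sql, run, ttl=300):
    """Cache-aside: reuse a prior result for the same SQL, else run and store."""
    key = "query:" + hashlib.sha256(sql.encode()).hexdigest()
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit), True   # (result, was_cached)
    result = run(sql)
    cache.setex(key, ttl, json.dumps(result))
    return result, False
```

Hashing the SQL text into the key means identical queries collapse onto one cache entry, which is what avoids duplicate load on the external databases.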
Time Series Analysis
- Implemented Dynamic Time Warping (DTW) for comparing time ranges or batch signals
- Applied normalisation and scaling so signals of different magnitudes are compared fairly
- Traceback path logic through the DTW cost matrix for alignment interpretation
- Built the analysis pipeline to accept typed inputs and return structured results
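A minimal DTW sketch along the lines described: fill the cumulative cost matrix, then trace back through it to recover the alignment path (variable names are illustrative, and any normalisation is assumed to happen before the call):

```python
import numpy as np

def dtw(a, b):
    """Return (distance, path) for 1-D signals a and b.

    D[i, j] holds the cumulative cost of the best alignment of
    a[:i] with b[:j]; the path is recovered by walking D backwards.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Traceback from the corner, always taking the cheapest predecessor.
    i, j, path = n, m, []
    while i > 0 or j > 0:
        path.append((i - 1, j - 1))
        steps = [(D[i - 1, j - 1], i - 1, j - 1),
                 (D[i - 1, j], i - 1, j),
                 (D[i, j - 1], i, j - 1)]
        _, i, j = min(s for s in steps if s[1] >= 0 and s[2] >= 0)
    return float(D[n, m]), path[::-1]
```

The O(n·m) double loop is the textbook formulation; for the row counts mentioned below, a banded (Sakoe-Chiba style) constraint or a compiled DTW library would be the usual speed-up.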
Data Handling & Preprocessing
- Built a data dictionary approach to classify incoming columns (datetime, numeric, categorical)
- Datetime parsing and cleanup for mixed precision timestamps
- Missing data handling using safe defaults, warnings, and guard checks
- Standardised typed outputs so downstream analysis modules can assume stable inputs
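The data-dictionary classification above can be sketched with stdlib parsing attempts, trying the strictest interpretation first and falling back (the three labels and the skip-empties rule mirror the bullets; the function name is illustrative):

```python
from datetime import datetime

def classify_column(values):
    """Classify a column's sample values as 'datetime', 'numeric', or 'categorical'.

    None/empty entries are skipped with a safe default rather than
    breaking classification.
    """
    samples = [v for v in values if v not in (None, "")]
    if not samples:
        return "categorical"  # safe default when nothing usable remains

    def all_parse(parser):
        try:
            for v in samples:
                parser(str(v))
            return True
        except ValueError:
            return False

    # Order matters: "2024-01-01" is not a float, but "2" is not a datetime.
    if all_parse(datetime.fromisoformat):
        return "datetime"
    if all_parse(float):
        return "numeric"
    return "categorical"
```

`datetime.fromisoformat` accepts mixed-precision ISO timestamps (date-only through fractional seconds), which covers the cleanup case mentioned above; a production version would also sample rather than scan full columns.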
Frontend Integration (Engineering-Focused UI)
- Built a Dash frontend for selecting databases, tables, time windows, and analysis methods
- Frontend-to-backend workflow design: query → cache → process → render → export
- CSV export support for processed results and repeatable analysis outcomes
- State handling for retrieving prior Redis keys and restoring previous work
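The query → cache → process → render → export flow can be sketched without the Dash layer, since the UI callbacks ultimately just drive a pipeline like this (function names and the dict-backed cache are illustrative stand-ins):

```python
import csv
import io

def run_pipeline(fetch, cache, key, process):
    """query → cache → process: reuse cached raw rows when available."""
    rows = cache.get(key)
    if rows is None:
        rows = fetch()
        cache[key] = rows  # stand-in for a Redis SET with TTL
    return process(rows)

def export_csv(records):
    """Render processed records to CSV text, ready for a download link."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

Because the cache key identifies the raw query, re-selecting a prior key in the UI replays the processing step against cached rows, which is what makes the investigations repeatable.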
Engineering Practices
- Modular Python structure: reusable classes, separated responsibilities, clean interfaces
- Practical debugging across distributed services (frontend, backend, Redis, DB)
- Worked with realistic dataset sizes (tens to hundreds of thousands of rows)
- Built with audit and compliance thinking in mind (logging, traceability, internal config DB)
What This Shows
This project demonstrates end-to-end capability: from deploying services and wiring them together, through database access and caching, to time-series analysis and a working web UI for running repeatable investigations on industrial-style data.