Four domains of deep experience, shaped by 25+ years of building production systems across industries and continents.
20 years of data work; AI is the natural next chapter.
I build systems that make large datasets useful — from semantic search engines that understand code by meaning, to data consolidation pipelines that merge and verify complex geospatial datasets.
My approach to AI is grounded in production reality: local-first architectures, embedding-based search, and pragmatic LLM integration. That infrastructure and data background means I build AI systems that actually work at scale, not just in demos.
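To give a flavour of what embedding-based search means in practice, here is a minimal sketch of semantic retrieval over code snippets. It assumes the sentence-transformers package and an off-the-shelf model (all-MiniLM-L6-v2); the corpus and helper names are illustrative, not a description of any particular production system.

```python
# Minimal sketch of embedding-based semantic search over code snippets.
# Model choice, corpus, and helper names are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # any embedding model works here

def build_index(snippets: list[str]) -> np.ndarray:
    """Embed every snippet once; rows are L2-normalised vectors."""
    vectors = model.encode(snippets, normalize_embeddings=True)
    return np.asarray(vectors)

def search(query: str, index: np.ndarray, snippets: list[str], top_k: int = 3):
    """Return the snippets closest to the query by cosine similarity."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = index @ q                      # cosine similarity, since vectors are normalised
    best = np.argsort(scores)[::-1][:top_k]
    return [(snippets[i], float(scores[i])) for i in best]

snippets = [
    "def haversine(lat1, lon1, lat2, lon2): ...  # great-circle distance",
    "SELECT id, geom FROM parcels WHERE ST_Intersects(geom, :bbox)",
    "async function pollFleetPositions(socket) { ... }",
]
index = build_index(snippets)
print(search("distance between two GPS coordinates", index, snippets))
```

The query never has to share keywords with the snippet it matches; the ranking comes from vector similarity, which is what lets a search engine understand code by meaning rather than by text overlap.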
I architect solutions that balance ambition with pragmatism. Whether it's evaluating vendor proposals, designing greenfield systems, or modernizing legacy platforms, I bring a vendor-agnostic perspective shaped by decades of hands-on building.
My presales and consulting work helps organizations make informed technology decisions. I evaluate architectures, identify risks, and design solutions that are right-sized for the problem — not the trend.
I build tools that solve real problems — from production billing systems with EU e-invoicing compliance, to real-time fleet monitoring dashboards, to CLI utilities that make complex workflows simple.
My stack is shaped by the problem, not by fashion. Python for data and AI work. PHP and Laravel for web applications. JavaScript/Node for real-time systems. C when performance demands it. The common thread is shipping working software that people rely on.
I design and automate infrastructure that runs reliably at scale. Full infrastructure-as-code stacks with Terraform and Ansible, monitoring with Prometheus and Grafana, and performance tuning from the kernel up through the application layer.
This includes specialized work with GIS tile and vector servers, SSL termination at scale with HAProxy, and platform engineering on both public and private clouds. The goal is always the same: reliable, observable, automated infrastructure that stays out of the way.
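As a small illustration of the observability side, the sketch below polls Prometheus's HTTP query API for scrape targets that are currently down. The endpoint URL is a hypothetical placeholder, and in practice a check like this usually lives in Prometheus alerting rules rather than an ad-hoc script.

```python
# Minimal sketch: query Prometheus's HTTP API and flag unreachable scrape targets.
# The Prometheus URL is an illustrative assumption, not a real endpoint.
import requests

PROMETHEUS_URL = "http://prometheus.internal:9090"  # hypothetical placeholder

def instant_query(expr: str) -> list[dict]:
    """Run a PromQL instant query and return the result vector."""
    resp = requests.get(f"{PROMETHEUS_URL}/api/v1/query", params={"query": expr}, timeout=5)
    resp.raise_for_status()
    payload = resp.json()
    if payload["status"] != "success":
        raise RuntimeError(f"query failed: {payload}")
    return payload["data"]["result"]

# 'up == 0' returns one sample per scrape target that is currently unreachable.
for sample in instant_query("up == 0"):
    labels = sample["metric"]
    print(f"DOWN: job={labels.get('job')} instance={labels.get('instance')}")
```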