Deep Dives
Areas I have studied intensively: going deep rather than broad to understand them.
Large Language Models
Studied the architecture, training, and alignment of large language models from first principles — transformers, RLHF, fine-tuning, and inference optimization. Built tooling around LLMs for document Q&A, code generation, and autonomous agents.
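One piece of the inference-optimization side can be sketched in a few lines: temperature-scaled sampling over a model's output logits. This is a toy illustration (the vocabulary and logit values are hypothetical, not from any real model):

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=None):
    """Sample a token index from raw logits via temperature-scaled softmax."""
    rng = rng or random.Random(0)
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max before exp for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r <= cum:
            return i
    return len(probs) - 1  # guard against floating-point rounding

# As temperature -> 0, sampling approaches greedy argmax;
# higher temperatures flatten the distribution.
```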
Robotics & Autonomous Systems
Explored the intersection of software, hardware, and intelligence through ROS2-based systems and language-model-controlled robots. Built natural language interfaces for real robot arms and autonomous navigation stacks at UEF Robotics Club.
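The core of a natural language interface for a robot arm is mapping free-form text to discrete commands. A toy sketch of that mapping, with a hypothetical command set (a real system would publish the result over ROS2 topics rather than return a tuple):

```python
def parse_command(text):
    """Map a free-form instruction to an (action, direction) pair.

    The action and direction vocabularies here are illustrative only.
    """
    text = text.lower()
    actions = {
        "move": "move",
        "rotate": "rotate",
        "open": "open_gripper",
        "close": "close_gripper",
    }
    directions = ["left", "right", "up", "down"]
    action = next((v for k, v in actions.items() if k in text), None)
    direction = next((d for d in directions if d in text), None)
    return action, direction
```

In practice this keyword matching is the fallback path; the language model handles phrasing the keywords miss.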
Systems Programming
Built things from scratch to understand how they actually work: a compiler, an N-gram language model in C, a Redis-compatible server, and a serverless orchestrator. The discipline of systems programming — where every byte and every syscall matters — changed how I think about software at every level.
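The N-gram model mentioned above was written in C; the underlying idea fits in a short Python sketch, assuming a bigram (N=2) case: count successor frequencies, then predict the most frequent successor.

```python
from collections import defaultdict, Counter

def train_bigrams(tokens):
    """Count bigram frequencies: each word maps to a Counter of successors."""
    model = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word):
    """Return the most frequent successor of `word`, or None if unseen."""
    if word not in model or not model[word]:
        return None
    return model[word].most_common(1)[0][0]
```

The C version trades these dictionaries for hand-rolled hash tables, which is exactly where the "every byte matters" lesson comes from.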
Machine Learning Infrastructure
Studied the engineering behind ML at scale: training pipelines, embedding-based retrieval, vision models, and production deployment. Applied these to industry matchmaking, spoken AI, and JSONL training data tooling.
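Embedding-based retrieval reduces to ranking documents by cosine similarity against a query vector. A minimal sketch with toy two-dimensional vectors (a production system would use a vector index rather than a full scan):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec, doc_vecs, k=1):
    """Return indices of the k documents most similar to the query."""
    ranked = sorted(
        range(len(doc_vecs)),
        key=lambda i: cosine(query_vec, doc_vecs[i]),
        reverse=True,
    )
    return ranked[:k]
```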