Machine Learning Research
San Francisco, USA
Since joining, I've been building out the multimodal ETL and synthetic data generation infrastructure at Liquid AI, enabling the company to scale up its training jobs. Most notably, this work led to the release of the LFM2-Audio-1.5B foundation model, which achieves conversational quality rivaling models 10× its size.
Machine Learning Research
San Francisco, USA
Having explored Large Language Models (LLMs) as they emerged during graduate school, I was keen to be part of this trend. I joined a former classmate as a founding engineer at Evidium to build evidence-grounded AI systems for healthcare. We focused on scalable architectures for reliably detecting, extracting, and reasoning over medical entities in terabyte-scale data pulled from electronic health records. We built and patented a knowledge platform, which is currently in use at a U.S. hospital.
Derivatives Structuring
London, UK
I began my career in DB's sales and trading graduate program in Frankfurt, Germany. I rotated through correlation and xVA trading before joining the interest rate derivatives structuring team. As a structurer, I developed and priced exotic interest rate products for institutional clients in the DACH region. I later moved to DB's London office to focus on the EMEA corporates and governments structuring business.
Quantitative Research
Remote
During the pandemic and the final stretch of my master's, I researched and implemented Temporal Difference (TD) Learning methods for a proprietary trader in collaboration with Dr. Daniel Bloch. That work was conducted under NDA, but our subsequent research on sample-efficient learning of price densities and derivative payoffs using TD-λ methods is available on SSRN.