I’m a machine learning engineer for batteries in Germany. I started as an electrician, built high-voltage battery prototypes in automotive, got close to the embedded stack, and then got pulled deep into machine learning.
These days, I build neural surrogate models for dynamic battery operation. Before that, I consulted for a Silicon Valley startup after my open-source battery-ML project put me on their radar.
I checked off my travel bucket list early: Work and Travel first, then a solo motorbike trip through Vietnam.
Why batteries, why AI
I got into the field by accident after undergrad, joining an 800 V / 750 kW automotive battery project.
I stayed because batteries are hard in the right way: messy physics, real hardware, tight safety margins, and scale that makes small improvements matter.
I see neural nets as a generic mesh and backprop as a generic solver.
That is how I think about battery ML. It earns its place when it cuts expensive engineering loops and learns what explicit models miss. It gets risky when the benchmark is too neat. It has to prove itself on real systems, rough edges included.
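The "generic mesh, generic solver" framing can be sketched in a few lines: a small MLP fitted with hand-written backprop to a toy equivalent-circuit voltage model. Everything here is hypothetical for illustration: the OCV curve, the ohmic resistance, and the network size are made-up constants, not numbers from any real project.

```python
# Minimal sketch: a tiny neural surrogate for a toy battery voltage model.
# Toy "physics" (hypothetical constants): V = OCV(soc) - I * R.
import numpy as np

rng = np.random.default_rng(0)

def voltage(soc, current, r=0.05):
    # Hypothetical open-circuit voltage curve: mildly nonlinear in SoC.
    ocv = 3.0 + 1.2 * soc + 0.2 * np.sin(3 * soc)
    return ocv - current * r  # terminal voltage under load

# Training data: random operating points (SoC in [0, 1], current in [-10, 10] A).
soc = rng.uniform(0.0, 1.0, 512)
current = rng.uniform(-10.0, 10.0, 512)
X = np.column_stack([soc, current / 10.0])  # normalize current to ~[-1, 1]
y = voltage(soc, current).reshape(-1, 1)

# One hidden layer with tanh: the "generic mesh".
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    loss = float((err ** 2).mean())
    # Backprop: the "generic solver".
    g_pred = 2 * err / len(X)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"final MSE: {loss:.4f}")
```

The point of the sketch is the division of labor: the explicit model supplies structure where it is trusted, and the network absorbs whatever that structure misses once real measurements replace the toy function.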
What I bring
- Bridging hardware, physics, and ML. I’ve worked close to battery hardware, simulation, embedded systems, and neural modelling. That helps me translate between layers of the stack and catch assumptions that only break at system level.
- System-level trade-offs. I’ve compared model families on realistic operating profiles. I care less about narrow benchmark wins than about whether a model holds up inside real engineering workflows.
- Structure in ambiguity. From prototypes to production codebases, I’ve had to turn open-ended problems into testable progress: aligning competing constraints across technical boundaries, then turning uncertainty into experiments.