I'm a software engineer who's spent 20+ years deep in systems code — from disaster recovery orchestration at VMware to co-creating Velox and Axiom, the open-source C++ libraries that power query execution across Presto, Spark, and PyTorch at Meta. I like hard problems at the intersection of performance and correctness.
The best execution engine isn't the one that's fastest on benchmarks — it's the one that makes every other system better by existing.
Velox was born from a simple observation: Meta's dozens of specialized data engines kept solving the same problems independently. Rather than optimize in isolation, we built a shared library of execution primitives — vectorized, adaptable, and designed for complex data types from day one. Axiom extends this vision to the front-end: SQL parsing, logical planning, and cost-based optimization that feed directly into Velox. Together, they make it possible to go from a SQL query to optimized execution in a single, composable stack.
Velox is an open-source C++ library that provides the building blocks for data-intensive computation: expression evaluation, aggregation, sorting, joining, and more. Axiom completes the picture — it's the front-end layer that handles SQL parsing, logical planning, cost-based optimization, and query orchestration on top of Velox. Together, they form a full stack: write a SQL query, and the system parses, optimizes, and executes it end to end.
Prestissimo: a drop-in C++ replacement for Presto workers, translating plan fragments into Velox plans for execution.
Gluten: an Intel-led integration that lets Velox execute Spark SQL queries via a JNI bridge and Substrait plans.
TorchArrow: a dataframe library for deep learning pipelines, translating operations into Velox plans under the hood.
I'm a regular speaker at VeloxCon, the annual conference for the Velox open-source community, and have presented at all four editions since the event launched. In 2025, I was a keynote speaker at VeloxCon China in Beijing.
Whether you want to talk about Velox, database internals, performance engineering, or an interesting technical challenge, I'd love to hear from you.