FreeThrow
Learned predictions that accelerate HNSW vector search indexes by 2×–3× at zero overhead, via embedding-model tuning and graph decomposition.
Data Systems · AI
Research and software. I build things and make them go brr.
Mostly in/around the Data and AI Systems Lab.
Formerly at DRW, TileDB, Bloomberg.
Concurrent B.A./M.S., Harvard University.
I'm originally from Bucharest, Romania. I moved to the US in 2022 to study CS at Harvard.
I love thinking about data systems and AI.
I recently started sailing at MIT, so you might see me there once the Charles unfreezes lol.
I love reading, and here are some of my current favorites:
ANN index with a hybrid on-disk/in-memory design: a small vector prefix in DRAM, the bulk on NVMe. 100× speedup over IVF_FLAT via adaptive projections (Johnson–Lindenstrauss), minimal data movement, smart query plans, and SIMD kernels.
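The Johnson–Lindenstrauss idea behind those adaptive projections can be sketched in a few lines: project vectors into a short DRAM-resident prefix that approximately preserves distances, filter candidates there, then re-rank the few survivors with the full vectors. This is only an illustrative toy, not the project's actual code; all names here are made up for the example.

```python
import numpy as np

def jl_project(vectors: np.ndarray, out_dim: int, seed: int = 0) -> np.ndarray:
    """Project vectors to out_dim with a random Gaussian JL matrix.

    Pairwise distances are approximately preserved, so these short
    projections can act as a cheap in-memory filter."""
    rng = np.random.default_rng(seed)
    d = vectors.shape[1]
    # Scale by 1/sqrt(out_dim) so expected squared distances are preserved.
    proj = rng.standard_normal((d, out_dim)) / np.sqrt(out_dim)
    return vectors @ proj

# Toy usage: filter in the projected space, then re-rank the survivors
# with full-precision vectors (which in the real system live on NVMe).
data = np.random.default_rng(1).standard_normal((1000, 256)).astype(np.float32)
query = data[42] + 0.01  # a vector very close to data[42]
sketch = jl_project(data, 32)           # same seed => same projection matrix
q_sketch = jl_project(query[None, :], 32)[0]
candidates = np.argsort(np.linalg.norm(sketch - q_sketch, axis=1))[:20]
best = candidates[np.argmin(np.linalg.norm(data[candidates] - query, axis=1))]
```

The design point: the 32-dim sketches fit in DRAM at 8× less space than the 256-dim originals, and only the ~20 candidate vectors need to be fetched from disk for exact re-ranking.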
Dynamic LoRA expert retrieval for continual learning. Embedding-based routing selects the nearest specialist adapter at inference time: +10% absolute / +25% relative over Qwen 2.5-Coder on CodeFlowBench.
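Embedding-based adapter routing can be sketched as a nearest-centroid lookup: each specialist adapter is represented by an embedding centroid, and a query is routed to the adapter whose centroid is most cosine-similar. A minimal sketch under that assumption; the adapter names and centroids below are hypothetical, not from the project.

```python
import numpy as np

# Hypothetical registry: one embedding centroid per LoRA adapter, e.g. the
# mean embedding of the tasks each specialist was fine-tuned on.
adapter_centroids = {
    "sql_expert":   np.array([0.9, 0.1, 0.0]),
    "rust_expert":  np.array([0.1, 0.9, 0.1]),
    "regex_expert": np.array([0.0, 0.2, 0.9]),
}

def route(query_embedding: np.ndarray) -> str:
    """Return the adapter whose centroid is most cosine-similar to the query."""
    def cos(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(adapter_centroids,
               key=lambda name: cos(query_embedding, adapter_centroids[name]))

print(route(np.array([0.05, 1.0, 0.0])))  # prints "rust_expert"
```

At inference time the selected adapter's LoRA weights would be loaded and applied to the base model; the routing step itself is just an embedding lookup, so it adds negligible latency.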
Exposed multiple zero-day exploits in Perplexity Comet and BrowserBase, reported with reproducible instructions. Writing a paper proposing architectural changes to agentic AI attention mechanisms to mitigate prompt-injection attacks.
| Languages | C++ · Python · JavaScript / TypeScript · P4 · HTML/CSS · Verilog |
|---|---|
| Domains | Database internals · High-performance computing · ML Systems |
| Competitive Programming | Codeforces Candidate Master · National Informatics Olympiad, Romania |