I'm a fifth-year Computer Science PhD candidate at Stanford, where I'm co-advised by Chris Ré and Kayvon Fatahalian, and affiliated with the Stanford AI Lab, Stanford CRFM, the Stanford Machine Learning Group, DAWN, and the Stanford Computer Graphics Lab. I'm also an academic partner with Together. In my research, I develop ML and systems algorithms to break crucial modeling bottlenecks. Recently, I've been especially interested in enabling longer-sequence models, with a line of work starting with H3 and FlashAttention and continuing with Hyena. This blog post summarizes our past two years of work on increasing the context length of foundation models.
In the past, I've also done work on labeling (Liger, FlyingSquid), data (TABi, Thanos), and human (Rekall) bottlenecks. I'm very fortunate to be supported by a Department of Defense NDSEG fellowship, a Magic Grant from the Brown Institute (2019-2020), and multiple grants from Stanford HAI.
I co-founded the Stanford MLSys Seminar Series in Fall 2020; we host talks every Thursday, livestreamed on YouTube. We started it just as a fun way to talk to people during the height of COVID, and we've been amazed and humbled by the uptake. Since then, we've turned it into a class at Stanford, and almost 10,000 people have subscribed to our channel. Check out our website, and subscribe on YouTube to join us on this journey!
In 2018, I graduated from Harvard with an AB and an SM in Computer Science, cum laude with highest thesis honors. When I'm not working on schoolwork or other projects, I spend most of my free time ballroom dancing.