I am interested in natural language processing. Here are some areas I have worked on:
- Semantic parsing: Convert input sentences (and their context) into meaning representations,
preferably executable ones such as database queries or computer programs (a toy example of this input/output contract follows this list).
Some common themes include learning from distant supervision, retrieval-based parsing, and compositional generalization.
- Web interaction: Interact with web pages, including previously unseen ones, by following natural language commands.
- Retrieval-augmented models: Design models that can retrieve information and use it to guide predictions.
The explicit retrieval step makes the model more interpretable, and the model can be updated without retraining by modifying the retrieval index.
- Few-shot learning: Generalize to new domains or tasks from limited training data.
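
To make the semantic parsing item above concrete, here is a minimal sketch of the task's input/output contract: a natural language question is mapped to an executable meaning representation (here a SQL query) and executed to produce an answer. The question, table, and hard-coded query are made up purely for illustration; a real parser learns this mapping rather than hard-coding it.

```python
import sqlite3

# Input: a natural language question.
question = "How many papers did the lab publish in 2020?"

# Output of a semantic parser: an executable meaning representation.
# Hard-coded here only to illustrate the contract; a real system learns it.
parsed_query = "SELECT COUNT(*) FROM papers WHERE year = 2020"

# Tiny in-memory database standing in for the execution environment.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE papers (title TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO papers VALUES (?, ?)",
    [("Paper A", 2019), ("Paper B", 2020), ("Paper C", 2020)],
)

# Executing the meaning representation yields the answer (its denotation).
(answer,) = conn.execute(parsed_query).fetchone()
print(question, "->", parsed_query, "->", answer)  # ... -> 2
```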
I did my PhD at Stanford University,
where I was advised by Percy Liang
and was a member of the Stanford NLP Group.
Before that, I received my bachelor's degree from MIT.
I like ice cream, languages, and pencil puzzles.