I am a final-year Ph.D. student in the Department of Statistics at the University of Oxford, supervised by Dr. Tom Rainforth and Prof. Yee Whye Teh.
Before that, I received my BS and MS degrees from Peking University and worked as a researcher at ByteDance AI Lab.
I currently work on natural language reasoning and generative models, with the aim of building real AI systems.
I will graduate in 2024 and am looking for an academic position in which to continue my research. If you are interested in my work, please feel free to drop me an email.
SelfCheck: Using LLMs to Zero-Shot Check Their Own Step-by-Step Reasoning
Ning Miao, Yee Whye Teh, Tom Rainforth
[pdf][code]
Learning Instance-Specific Augmentations by Capturing Local Invariances
Ning Miao, Tom Rainforth, Emile Mathieu, Yann Dubois, Yee Whye Teh, Adam Foster, Hyunjik Kim
In ICML, 2023. [bib][pdf][code]
On Incorporating Inductive Biases into VAEs
Ning Miao, Emile Mathieu, Siddharth N, Yee Whye Teh, Tom Rainforth
In ICLR, 2022. [bib][pdf][code]
Do You Have the Right Scissors? Tailoring Pre-trained Language Models via Monte-Carlo Methods
Ning Miao, YuXuan Song, Hao Zhou, Lei Li
In ACL, 2020. [bib][pdf][code]
Dispersed Exponential Family Mixture VAEs for Interpretable Text Generation
Wenxian Shi, Hao Zhou, Ning Miao, Shenjian Zhao, Lei Li
In ICML, 2020. [bib][pdf][code]
Improving Maximum Likelihood Training for Text Generation with Density Ratio Estimation
YuXuan Song, Ning Miao, Hao Zhou, Lei Li
In AISTATS, 2020. [bib][pdf]
Kernelized Bayesian Softmax for Text Generation
Ning Miao, Hao Zhou, Chengqi Zhao, Wenxian Shi, Lei Li
In NeurIPS, 2019. [bib][pdf][code]
Generating Fluent Adversarial Examples for Natural Languages
Huangzhao Zhang, Hao Zhou, Ning Miao, Lei Li
In ACL, 2019. [bib][pdf][code]
Constrained Sentence Generation via Metropolis-Hastings Sampling
Ning Miao, Hao Zhou, Lili Mou, Rui Yan, Lei Li
In AAAI, 2019. [bib][pdf][code]
Last updated: Aug 02, 2023