Yuling Yao



About

I am a Flatiron Research Fellow at the Flatiron Institute, Center for Computational Mathematics. My research interests lie in Bayesian computation, Bayesian modeling, machine learning, and causal inference.

I obtained my PhD in Statistics from Columbia University, where I was advised by Andrew Gelman. Before that, I received my undergraduate degrees from Tsinghua University in Mathematics and in Economics.

About my research

• My ultimate goal is to develop a scalable Bayesian workflow for open-ended real-data problems. Recent applications include lead fallout in Paris, arsenic diffusion in groundwater, and Covid-19 mortality in Bangladesh.

• But better applied statistics requires better methodology. To that end, I investigate statistical and machine learning methods, with a focus on model evaluation and aggregation, meta-learning, and causal inference. Ongoing projects include cross-validation, stacking and hierarchical stacking, and covariate imbalance.

• Facilitating complex methods, in turn, requires scalable and diagnosable computing. Hence, I develop algorithms and theory for fully Bayesian and approximate computation. Recently at Flatiron, I have been interested in combining Monte Carlo methods with sophisticated numerical tricks and quadrature; applications include importance sampling, simulated tempering and annealing, dropout, and metastability in MCMC.

Publications  

    Bayesian methodology

Yuling Yao [2021]
Toward a scalable Bayesian workflow. PhD Thesis.
[online]

A scalable Bayesian workflow needs the combination of fast but reliable computing, efficient but targeted model evaluation, and extensive but directed model building and expansion.

Yuling Yao, Gregor Pirš, Aki Vehtari, Andrew Gelman. [2021]
Bayesian hierarchical stacking. preprint.
[preprint]

With input-varying yet partially pooled model weights, hierarchical stacking improves both average and conditional predictions. Our Bayesian formulation includes constant-weight (complete-pooling) stacking as a special case.

Yuling Yao, Collin Cademartori, Aki Vehtari, Andrew Gelman. [2020]
Adaptive Path Sampling in Metastable Posterior Distributions. under review.
[preprint]   [Package]   [Blog]

From importance sampling to adaptive importance sampling to path sampling to adaptive path sampling, and from Rao–Blackwell to Wang–Landau to Jarzynski–Crooks: all about free energy and simulated tempering.

Yuling Yao, Aki Vehtari, Andrew Gelman. [2020]
Stacking for Non-mixing Bayesian Computations: The Curse and Blessing of Multimodal Posteriors. under review.
[preprint]   [Code]   [Blog]

The result from multi-chain stacking is not necessarily equivalent, even asymptotically, to fully Bayesian inference, but it serves many of the same goals. Under misspecified models, stacking can give better predictive performance than full Bayesian inference; hence multimodality can be a blessing rather than a curse.
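The chain-stacking idea can be sketched in a few lines: given pointwise held-out log predictive densities from each non-mixing chain, choose simplex weights that maximize the log score of the weighted mixture. Below is a minimal Python illustration of that optimization, not the interface of any published package; the function name and the toy data are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp, softmax

def stack_chains(lpd):
    """Stack K chains given an (n, K) matrix of pointwise held-out
    log predictive densities (e.g. from leave-one-out CV).
    Returns simplex weights maximizing the mixture log score."""
    n, K = lpd.shape

    def neg_log_score(z):
        # pin the first logit at 0 so the softmax is identified
        w = softmax(np.concatenate(([0.0], z)))
        # log density of the weighted mixture at each held-out point
        return -logsumexp(lpd + np.log(w), axis=1).sum()

    res = minimize(neg_log_score, np.zeros(K - 1), method="BFGS")
    return softmax(np.concatenate(([0.0], res.x)))

# Toy data: the second chain predicts the held-out points far better,
# so stacking should put almost all the weight on it.
rng = np.random.default_rng(1)
lpd = np.column_stack([rng.normal(-3.0, 0.1, 200),
                       rng.normal(-1.0, 0.1, 200)])
weights = stack_chains(lpd)
```

In the actual method the held-out densities are obtained by Pareto-smoothed importance sampling rather than refitting, and the weights can further be made input-dependent, as in hierarchical stacking.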

Andrew Gelman, Yuling Yao. [2020]
Holes in Bayesian Statistics. Journal of Physics G.
[Online]

This does not mean that we think Bayesian inference is a bad idea, but it does mean that there is a tension between Bayesian logic and Bayesian workflow which we believe can only be resolved by considering Bayesian logic as a tool, a way of revealing inevitable misfits and incoherences in our model assumptions, rather than as an end in itself.

Yuling Yao. [2019+]
Bayesian Aggregation. under review.
[preprint]

Aki Vehtari, Daniel Simpson, Andrew Gelman, Yuling Yao, Jonah Gabry. [2019+]
Pareto Smoothed Importance Sampling. under review.
[preprint]

How to run importance sampling with efficiency and reassurance.

Aki Vehtari, Daniel Simpson, Yuling Yao, Andrew Gelman [2018]
Limitations of "Limitations of Bayesian leave-one-out cross-validation for model selection". Computational Brain & Behavior.
[Online]  

Yuling Yao, Aki Vehtari, Daniel Simpson, Andrew Gelman [2018]
Yes, But Did it Work?: Evaluating Variational Inference. Proceedings of the 35th International Conference on Machine Learning.
[Online]   [Blog]   [Code]

The Pareto-smoothed importance sampling diagnostic gives a goodness-of-fit measure for the joint variational approximation, while simultaneously reducing the error in the estimate.

Yuling Yao, Aki Vehtari, Daniel Simpson, Andrew Gelman [2018]
Using Stacking to Average Bayesian Predictive Distributions (With Discussion and Rejoinder). Bayesian Analysis, 13, 917-1003.
[Online]   [Code]   [R package]

"Remember that using Bayes' Theorem doesn't make you a Bayesian. Quantifying uncertainty with probability makes you a Bayesian."

    Applied statistics

Prabhat Barnwal, Yuling Yao (equal contribution), Yiqian Wang, Nishat Akter Juy, Shabib Raihan, Mohammad Ashraful Haque, Alexander van Geen. [2021]
No excess mortality detected in rural Bangladesh in 2020 from repeated surveys of a population of 81,000. preprint.
[preprint]

Yuling Yao, Rajib Mozumder, Benjamin Bostick, Brian Mailloux, Charles Harvey, Andrew Gelman, Alexander van Geen. [2021]
Making the most of imprecise measurements: Changing patterns of arsenic concentrations in shallow wells of Bangladesh from laboratory and field data. preprint.
[preprint]

Imprecise but widely accessible field-kit tests, combined with flexible statistical modeling that facilitates open-ended data gathering, can balance total cost against accuracy in many areas of geoscience research and policy.

Andrew Gelman, Aki Vehtari, Daniel Simpson, Charles Margossian, Bob Carpenter, Yuling Yao, Paul-Christian Bürkner, Lauren Kennedy, Jonah Gabry, Martin Modrák. [2020]
Bayesian workflow. preprint.
[preprint]

Theoretical statistics indeed is the theory of applied statistics.

Alexander van Geen, Yuling Yao, Tyler Ellis, Andrew Gelman. [2020]
Fallout of Lead over Paris from the 2019 Notre-Dame Cathedral Fire. GeoHealth.
[Online]   [Code]   [Media coverage (Le Monde)]   [Media coverage 2]

How much lead was there after the fire?

Oscar Chang, Yuling Yao, David Williams-King, Hod Lipson. [2019+]
Ensemble Model Patching: A Parameter-Efficient Variational Bayesian Neural Network. arXiv preprint.
[Online]   [Blog]  

Running Bayesian neural networks on ImageNet: more expressive than MC-dropout, more affordable than mean-field VI.

Maarten Marsman, Felix D Schönbrodt, Richard D Morey, Yuling Yao, Andrew Gelman, Eric-Jan Wagenmakers [2016]
A Bayesian bird's eye view of ‘Replications of important results in social psychology’. Royal Society Open Science, 4, 160426.
[Online]

Yu-Sung Su, Yuling Yao [2015+]
Happy Generations, Depressed Generations: How and Why Chinese People’s Life Satisfactions Vary across Generations, under review.
[preprint]

Yu-Sung Su, Yuling Yao [2015]
Is the rice dumpling sweet or salty? Adjusting the selection bias of online surveys by multilevel regression and poststratification (in Chinese). Journal of Tsinghua University, 03, 43.
[Download]

Software

I am on the developer teams of the following software packages:

    LOO

An R package for efficient approximate leave-one-out cross-validation (LOO) using Pareto smoothed importance sampling (PSIS), a new procedure for regularizing importance weights.
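The core of PSIS can be sketched briefly: fit a generalized Pareto distribution to the largest importance weights and replace those raw weights with expected order statistics of the fit; the estimated shape k̂ doubles as a diagnostic (values above roughly 0.7 signal unreliable estimates). The following is a simplified Python illustration, not the loo package's implementation: the function name is hypothetical, and it uses a plain maximum-likelihood GPD fit where the actual method uses a regularized estimator.

```python
import numpy as np
from scipy.stats import genpareto

def psis_smooth(log_w, tail_frac=0.2):
    """Pareto-smooth the largest importance weights.
    log_w: raw log importance ratios, one per draw.
    Returns (smoothed log weights, khat shape diagnostic)."""
    S = len(log_w)
    # tail size: at most 20% of draws, at most 3*sqrt(S)
    M = int(np.ceil(min(tail_frac * S, 3 * np.sqrt(S))))
    order = np.argsort(log_w)              # ascending
    tail_idx = order[-M:]                  # indices of the M largest
    cutoff = np.exp(log_w[order[-M - 1]])  # largest non-tail weight
    excess = np.exp(log_w[tail_idx]) - cutoff

    # ML fit of the generalized Pareto to the exceedances
    # (location fixed at 0); PSIS proper regularizes this estimate
    khat, _, sigma = genpareto.fit(excess, floc=0.0)

    # replace the tail with expected order statistics of the fit,
    # capped at the largest observed weight
    q = (np.arange(1, M + 1) - 0.5) / M
    new_tail = cutoff + genpareto.ppf(q, khat, loc=0.0, scale=sigma)
    new_tail = np.minimum(new_tail, np.exp(log_w.max()))

    smoothed = log_w.astype(float).copy()
    smoothed[tail_idx] = np.log(new_tail)  # tail_idx is sorted by log_w
    return smoothed, khat

# Toy example: standard-normal log ratios (log-normal weights)
rng = np.random.default_rng(0)
log_w = rng.standard_normal(4000)
smoothed, khat = psis_smooth(log_w)
```

In loo itself this smoothing is applied to the per-observation importance ratios, stabilizing the approximate leave-one-out estimates that would otherwise be dominated by a few extreme weights.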

    Stan

Stan is a state-of-the-art platform for statistical modeling and high-performance statistical computation. Thousands of users rely on Stan for statistical modeling, data analysis, and prediction in the social, biological, and physical sciences, engineering, and business.

last updated: Jul 2021
© 2021 Yuling Yao