I am a final-year Ph.D. student in the Department of Statistics at Purdue University, advised by Jennifer Neville.
I have also worked closely with Vinayak Rao, Qiang Liu, and Petros Drineas.
I obtained a joint M.S. degree in Statistics and Computer Science from Purdue University in 2015, and a B.S. degree in Statistics from the Special Class for the Gifted Young at the University of Science and Technology of China in 2013.
My general research area is statistical machine learning.
In particular, my research interests include kernel methods for hypothesis testing, approximate Bayesian inference (MCMC and variational methods), statistical models for network data, and randomized sketching algorithms.
Previously, I worked on nonparametric goodness-of-fit tests for discrete distributions, point-process models for dynamic networks, graph sampling algorithms, and ensemble methods for collective classification.
Jiasen Yang, Qiang Liu*, Vinayak Rao*, and Jennifer Neville. Goodness-of-fit Testing for Discrete Distributions via Stein Discrepancy. In Proceedings of the 35th International Conference on Machine Learning (ICML), 2018.
Agniva Chowdhury, Jiasen Yang, and Petros Drineas. An Iterative, Sketching-based Framework for Ridge Regression. In Proceedings of the 35th International Conference on Machine Learning (ICML), 2018.
Jiasen Yang, Vinayak Rao, and Jennifer Neville. Decoupling Homophily and Reciprocity with Latent Space Network Models. In Proceedings of the 33rd Conference on Uncertainty in Artificial Intelligence (UAI), 2017.
Jiasen Yang, Bruno Ribeiro, and Jennifer Neville. Stochastic Gradient Descent for Relational Logistic Regression via Partial Network Crawls. In Proceedings of the 7th International Workshop on Statistical Relational AI (StarAI), 2017.
Jiasen Yang, Bruno Ribeiro, and Jennifer Neville. Should We Be Confident in Peer Effects Estimated from Partial Crawls of Social Networks? In Proceedings of the 11th International AAAI Conference on Web and Social Media (ICWSM), 2017.