Faming Liang

Distinguished Professor, Department of Statistics, Purdue University, West Lafayette, IN 47907.

Office: Math Building 520;    Phone: (765) 494-4452;    Email: fmliang@purdue.edu


Courses taught:

  •       Computing for Statistics

  •       Computer Intensive Statistical Methods

  •       Linear Model

  •       Regression Analysis

  •       Probability

  •       Mathematical Statistics

  •       Introduction to Linear Models

  •       Probability (graduate course)

  •       Statistical Inference (graduate course)

  •       Computing for Bioinformatics (graduate course)

  •       Monte Carlo Methods (graduate course)

Lecturing Assignment for Spring Semester of 2022:

STAT512 Applied Regression Analysis

See BrightSpace for the lecture notes and assignments.


Research Interests: 

  • Markov Chain Monte Carlo

  • Big Data

  • Bioinformatics

  • Spatial Statistics

  • Machine Learning

  • Statistical Genetics

  • Stochastic Optimization

Editorial Service:

  • Co-Editor, Journal of Computational and Graphical Statistics, 2022-2024

  • Associate Editor, Journal of Computational and Graphical Statistics, 2006-2021

  • Associate Editor, Annals of Mathematical Sciences and Applications, 2015-present

  • Associate Editor, Journal of the American Statistical Association, 2010-2018

  • Associate Editor, Bayesian Analysis, 2010-2016

  • Associate Editor, Technometrics, 2013-2017

  • Associate Editor, Biometrics, 2006-2008

Honors:

  •   IMS Fellow, 2013.

  •   ASA Fellow, 2011.

  •   Elected Member, International Statistical Institute (ISI), 2005.

  •   Dean's Citation Award, 2016.

  •   Youden Prize, 2017.

Selected Publications:

  • Books

  1.  Kendall, W.S., Liang, F., and Wang, J.S. (2005) (Editors) Markov Chain Monte Carlo: Innovations and Applications. World Scientific: Singapore.  ISBN 981-256-427-6.


  2.  Liang, F., Liu, C. and Carroll, R.J. (2010) Advanced Markov Chain Monte Carlo Methods: Learning from Past Samples. Wiley. ISBN: 978-0-470-74826-8.

  •      Selected Packages 

  1.    Cheng, Y. and Liang, F. (2016)  RSAgeo: Resampling-Based Analysis of Geostatistical Data, available at https://cran.r-project.org/web/packages/RSAgeo/.

  2.    Jia, B., Liang, F., Shi, R., and Xu, S. (2017) equSA: Estimate Directed and Undirected Graphical Models and Construct Networks, available at https://cran.r-project.org/web/packages/equSA/.
  3.   Jia, B. and Liang, F. (2017)  ICmiss Package and Manual.

  4.   Sun, Y., Song, Q. and Liang, F. (2021) Sparse Deep Learning, available at https://github.com/sylydya/Consistent-Sparse-Deep-Learning-Theory-and-Computation.

  •    50 Selected Publications (from over 120 publications)

  1. Wong, W.H. and Liang, F. (1997) Dynamic weighting in Monte Carlo and optimization, Proc. Natl. Acad. Sci. USA, 94, 14220-14224.

  2. Liu, J.S., Liang, F., and Wong, W.H. (2000) The use of multiple-try method and local optimization in Metropolis sampling, J. Amer. Statist. Assoc., 95, 121-134.

  3. Liang, F. and Wong, W.H. (2000) Evolutionary Monte Carlo sampling: applications to $C_p$ model sampling and change-point problem. Statistica Sinica, 10, 317-342.

  4. Liu, J.S., Liang, F., and Wong, W.H. (2001) A theory for dynamic weighting in Monte Carlo, J. Amer. Statist. Assoc., 96, 561-573.

  5. Liang, F. and Wong, W.H. (2001) Real-parameter evolutionary sampling with applications in Bayesian Mixture Models, J. Amer. Statist. Assoc., 96, 653-666.

  6. Liang, F. and Wong, W.H. (2001) Evolutionary Monte Carlo for Protein Folding simulations, Journal of Chemical Physics, 115, 3374-3380.

  7. Liang, F. (2002) Dynamically Weighted Importance Sampling in Monte Carlo Computation, J. Amer. Statist. Assoc., 97, 807-821.

  8. Liang, F. (2003) An Effective Bayesian Neural Network Classifier with a Comparison Study to Support Vector Machine,  Neural Computation, 15, 1959-1989.

  9.  Liang, F. (2003) Use of sequential structure in simulation from high dimensional systems, Physical Review E,  67, 56101-56107.

  10. Zhang, J., Liang, F., Dassen, W. and de Gunst, M. (2003) Search for Haplotype-Interactions that are susceptible to type I diabetes using unphased genotype data. American J. Human Genetics, 73, 1385-1401.

  11. Liang, F. (2004). Generalized 1/k-Ensemble Algorithm, Physical Review E, 69, 66701-66707.

  12. Liang, F. (2004) Annealing Contour Monte Carlo for Structure Optimization in an Off-lattice Protein Model. Journal of Chemical Physics, 120, 6756-6763.

  13. Liang, F. (2005) Bayesian neural networks for non-linear time series forecasting. Statistics and Computing, 15, 13-29.

  14. Liang, F. (2005) Generalized Wang-Landau algorithm for Monte Carlo Computation. J. Amer. Statist. Assoc., 100, 1311-1327.

  15. Liang, F. (2006) A theory on flat histogram Monte Carlo algorithms. Journal of Statistical Physics, 122, 511-529.

  16. Liang, F., Liu, C. and Carroll, R.J. (2007) Stochastic Approximation in Monte Carlo Computation.   J. Amer. Statist. Assoc., 102, 305-320.
  17. Liang, F. (2007) Continuous Contour Monte Carlo for Marginal Density Estimation with an Application to a Spatial Statistical Model, Journal of Computational and Graphical Statistics, 16(3), 608-632.

  18. Liang, F. (2007) Annealing Stochastic Approximation Monte Carlo for Neural Network Training. Machine Learning, 68(3), 201-233.

  19. Liang, F. and Zhang, J. (2008) Estimating FDR under general dependence using stochastic approximation. Biometrika, 95(4), 961-977.

  20. Liang, F. (2009) Improving SAMC Using Smoothing Methods: Theory and Applications to Bayesian Model Selection Problems. The Annals of Statistics, 37, 2626-2654.

  21. Liang, F. (2009) On the use of SAMC for Monte Carlo integration. Statistics & Probability Letters, 79, 581-587.

  22. Liang, F. (2010) A double Metropolis-Hastings sampler for spatial models with intractable normalizing constants. Journal of Statistical Computation and Simulation, 80, 1007-1022.

  23. Liang, F. (2010) Trajectory averaging for stochastic approximation  MCMC algorithms. Annals of Statistics, 38, 2823-2856.

  24. Liang, F. (2011) Evolutionary Stochastic Approximation Monte Carlo for Global Optimization.  Statistics and Computing, 21, 375-393.

  25. Park, J. and Liang, F. (2012) Bayesian analysis of geostatistical models with an auxiliary lattice. Journal of Computational and Graphical Statistics, 21, 453-475.

  26. Shi, X., Zhu, H., Ibrahim, J.G., Liang, F., Lieberman, J. and Styner, M. (2012)  Intrinsic regression models for medial representation of subcortical structures. J. Amer. Statist. Assoc., 107, 12-23. 

  27.  Ryu, D., Liang, F. and Mallick, B.K. (2013). Sea Surface Temperature Modeling using Radial Basis Function Networks with a Dynamically Weighted Particle Filter.  J. Amer. Statist. Assoc., 108, 111-123.

  28. Liang, F., Cheng, Y., Song, Q., Park, J., and Yang, P. (2013). A Resampling-based Stochastic Approximation Method for Analysis of Large Geostatistical Data. J. Amer. Statist. Assoc., 108, 325-339.

  29. Liang, F., Song, Q., and Yu, K. (2013).  Bayesian Subset Modeling for High Dimensional Generalized Linear Models.  J. Amer. Statist. Assoc., 108, 589-606.

  30. Song, Q., Wu, M. and Liang, F. (2014). Weak Convergence Rates of Population versus Single-Chain Stochastic Approximation MCMC Algorithms. Advances in Applied Probability, 46, 1059-1083.

  31. Liang, F., Cheng, Y., and Lin, G. (2014). Simulated Stochastic Approximation Annealing for Global Optimization with a Square-root Cooling Schedule. J. Amer. Statist. Assoc., 109, 847-863.

  32. Song, Q. and Liang, F. (2015). A Split-and-Merge Bayesian Variable Selection Approach for Ultra-high dimensional Regression. J. Royal Statist. Soc. B, 77(5), 947-972.

  33. Song, Q. and Liang, F. (2015). High Dimensional Variable Selection with Reciprocal L1-Regularization. J. Amer. Statist. Assoc., 110, 1607-1620.

  34. Liang, F., Song, Q. and Qiu, P. (2015). An Equivalent Measure of Partial Correlation Coefficients for High Dimensional Gaussian Graphical Models. J. Amer. Statist. Assoc., 110, 1248-1265.
  35. Liang, F., Jin, I.H., Song, Q. and Liu, J.S. (2016). An Adaptive Exchange Algorithm for Sampling from Distributions with Intractable Normalizing Constants. J. Amer. Statist. Assoc., 111, 377-393.

  36. Liang, F., Shi, R. and Mo, Q. (2016). A split-and-merge approach for singular value decomposition of large-scale matrices. Statistics and Its Interface, 9, 453-459.

  37.  Liang, F., Kim, J. and Song, Q. (2016). A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.  Technometrics, 58, 304-318.

  38. Xue, J. and Liang, F. (2017). A Robust Model-Free Feature Screening Method for Ultrahigh-Dimensional Data. Journal of Computational and Graphical Statistics, 26(4), 803-813.

  39. Liang, F., Li, Q. and Zhou, L.  (2018) Bayesian Neural Networks for Selection of drug sensitive Genes. J. Amer. Statist. Assoc., 113, 955-972.

  40. Liang, F., Jia, B., Xue, J., Li, Q. and Luo, Y. (2018) An imputation-regularized optimization algorithm for high-dimensional missing data problems and beyond. J. Royal Statist. Soc. B,  80(5), 899-926.  

  41. Li, Q., Shi, R. and Liang, F. (2019). Drug Sensitivity Prediction with High-Dimensional Mixture Regression (Algorithm Implementation). PLoS One,  14(2): e0212108.

  42. Xue, J. and Liang, F. (2019) Double-parallel Monte Carlo for Bayesian analysis of big data.  Statistics and Computing, 29(1), 23-32.
  43. Deng, W., Zhang, X., Liang, F., and Lin, G. (2019). An Adaptive Empirical Bayesian Method for Sparse Deep Learning. NeurIPS 2019, in press.

  44.  Song, Q., Sun, Y., Ye, M., and Liang, F. (2020). Extended Stochastic Gradient MCMC for Large-Scale Bayesian Variable Selection. Biometrika, in press.

  45. Deng, W., Feng, Q., Gao, L., Liang, F., and Lin, G. (2020). Non-convex Learning via Replica Exchange Stochastic Gradient MCMC. ICML 2020, in press.

  46. Deng, W., Lin, G. and Liang, F. (2020). A Contour Stochastic Gradient Langevin Dynamics Algorithm for Simulations of Multi-modal Distributions.  NeurIPS 2020, in press.

  47. Liang, F., Xue, J. and Jia, B. (2020). Markov neighborhood regression for high-dimensional inference. Journal of the American Statistical Association, in press.

  48. Sun, Y., Song, Q. and Liang, F. (2021). Consistent sparse deep learning: Theory and Computation. Journal of the American Statistical Association, in press.

  49. Sun, Y., Xiong, W. and Liang, F. (2021). Sparse deep learning: A new framework immune to local traps and miscalibration. NeurIPS 2021, in press.
  50. Sun, Y. and Liang, F. (2022). A Kernel-Expanded Stochastic Neural Network. J. Royal Statist. Soc. B, in press.