We are starting the new year with two new papers on causal discovery (both joint work with Mathias Drton):
- Wenyu Chen's paper, Causal structure learning via local graphs, has been accepted for publication in the SIAM Journal on Mathematics of Data Science (SIMODS). The paper proposes an efficient method for causal structure learning in the presence of unmeasured variables by leveraging the local separation property of large (random) networks.
- Shiqing Yu's paper, Directed graphical models and causal discovery for zero-inflated data, has been accepted for an oral presentation at the Causal Learning and Reasoning (CLeaR) Conference (9% acceptance rate). The paper establishes the identifiability of causal relations for a flexible class of models for zero-inflated data and develops algorithms for learning the graph from observational data.
I'm excited to start working on a new NIH XAI grant titled Explainable Machine Learning to Guide Prefrontal Brain Stimulation, in collaboration with UW colleagues Zaid Harchaoui, Azadeh Yazdan, and Eric Shea-Brown. This project combines statistical learning and applied math with neurobiology and bioengineering to help understand how the brain’s rich internal dynamics can be harnessed to steer its plasticity. We'll have lots of interesting data to solve really cool statistical problems!
CONGRATULATIONS to three graduating PhD students, Wenyu Chen, Xiudi Li and Kunhui Zhang! Having successfully defended their dissertations, Wenyu and Kunhui are staying local to join Facebook and Amazon, and Xiudi will be moving across the country to do a postdoc at Harvard. What an exciting summer!!
I am honored to receive the 2022 Leo Breiman Award from the American Statistical Association (ASA) Section on Statistical Learning and Data Science (SLDS). This award reflects the excellent work by my amazing team and would not have been possible without the guidance of my mentors and collaborators over the years.
CONGRATULATIONS to Kun Yue for receiving a best student paper award in the WNAR student paper competition for her recent work on estimating variance components in large linear mixed models.
The newest version of the netgsa package offers considerable speedups and interactive network visualization. Check out the accompanying paper published in PLoS Comp Bio for details and the vignette for examples.
Our paper, Neural Granger Causality, which develops a flexible framework for estimating Granger causality using sparse deep neural networks, has been published in IEEE TPAMI.
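For readers curious about the basic idea, here is a minimal sketch (not the paper's implementation): fit one small network per target series and place a group penalty on the input-layer weights so that all lags of an input series are shrunk together; a near-zero group is read as an absence of Granger-causal influence. The network size, penalty strength, and simulated data below are illustrative choices only.

```python
# Minimal illustrative sketch of component-wise neural Granger causality.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Simulated data: T time points, p series; series 1 drives series 0 with lag 1.
T, p, lag, hidden = 200, 3, 2, 16
X = torch.randn(T, p)
for t in range(1, T):
    X[t, 0] = 0.8 * X[t - 1, 1].item() + 0.1 * torch.randn(1).item()

def lagged_design(X, lag, target):
    # Each row stacks [x_{t-1}, ..., x_{t-lag}] (flattened); response is x_t[target].
    Z = torch.stack([X[t - lag:t].flip(0).reshape(-1) for t in range(lag, len(X))])
    return Z, X[lag:, target]

lam = 0.05  # group-penalty strength (an arbitrary illustrative value)
effects = torch.zeros(p, p)  # effects[i, j]: estimated strength of series j -> series i

for target in range(p):
    Z, y = lagged_design(X, lag, target)
    net = nn.Sequential(nn.Linear(p * lag, hidden), nn.ReLU(), nn.Linear(hidden, 1))
    opt = torch.optim.Adam(net.parameters(), lr=0.01)
    for _ in range(500):
        opt.zero_grad()
        mse = ((net(Z).squeeze(-1) - y) ** 2).mean()
        # First-layer weights reshaped to (hidden, lag, series); penalizing the
        # per-series norms groups all lags of each input series together.
        W = net[0].weight.reshape(hidden, lag, p)
        loss = mse + lam * W.norm(dim=(0, 1)).sum()
        loss.backward()
        opt.step()
    with torch.no_grad():
        effects[target] = net[0].weight.reshape(hidden, lag, p).norm(dim=(0, 1))

print(effects)  # the (row 0, column 1) entry should dominate row 0's off-diagonal terms
```

Note that the plain gradient updates above only shrink the groups rather than zeroing them exactly; approaches of this kind typically use proximal updates to obtain exact zeros, but shrinkage suffices to illustrate the selection idea.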
CONGRATULATIONS to our graduating class of 2021, Aaron Hudson and Xu (Steven) Wang. Having successfully passed their final exams, Aaron and Steven will soon start the next chapter of their journey: Aaron as a postdoc at UC-Berkeley and Steven as a data scientist at Facebook. Well done guys!
In our recent paper, Emily Fox and I review the classical notion of Granger causality as well as recent developments based on statistical machine learning.
CONGRATULATIONS to Aaron Hudson for winning the 2021 David P. Byar Early Career Award. You can read more about Aaron's great work and award here and find the award-winning paper on arXiv.
I am excited to start working on my new R01 grant focused on developing novel statistical inference procedures for biomedical big data.
In collaboration with Simge Küçükyavuz, I have recently revisited my optimization roots to develop mixed integer programming algorithms for learning directed acyclic graphs (DAGs) from observational data. Our first paper has been accepted for publication in the INFORMS Journal on Optimization and is available here. Our second paper, available on arXiv, develops a second-order conic program for DAG learning with a data-driven early stopping criterion that guarantees a consistent estimate.
In collaboration with the Promislow Lab, we recently investigated changes in metabolic networks associated with the effect of dietary restriction on longevity. The resulting paper, highlighted with a perspective article in PLoS Genetics, uses our recent tool for differential network analysis, available in the CorDiffViz package.