https://uni-sydney.zoom.us/j/88338146962
Conditional calibration is an approach to controlling the false discovery rate (FDR) under fully or partially known dependence, by separately calibrating a data-dependent rejection rule for each hypothesis. I will discuss the approach in general and describe three concrete applications: (1) the dependence-adjusted Benjamini-Hochberg (dBH) procedure, which uniformly dominates the BH procedure under positive regression dependence and provably controls FDR under general dependence, (2) a calibrated knockoff procedure that uniformly dominates knockoffs, yielding especially large power and stability gains in contexts where knockoff methods underperform, and (3) a conservatively biased estimator for the FDR of a generic model selection algorithm such as the lasso, graphical lasso, or forward stepwise regression.
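As background for the first application, the classic Benjamini-Hochberg step-up rule that dBH adjusts can be sketched as follows. This is a minimal illustration of the standard BH procedure only, not the dBH method from the talk; the function name and interface are my own for illustration.

```python
# Sketch of the classic Benjamini-Hochberg (BH) step-up procedure
# (standard background; NOT the dependence-adjusted dBH method).
def bh_rejections(pvals, alpha=0.05):
    """Return indices of hypotheses rejected by BH at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p-value
    # Step-up rule: find the largest k with p_(k) <= (k/m) * alpha
    k_star = 0
    for k, i in enumerate(order, start=1):
        if pvals[i] <= k * alpha / m:
            k_star = k
    # Reject the k_star hypotheses with the smallest p-values
    return sorted(order[:k_star])
```

For example, with p-values (0.01, 0.02, 0.03, 0.04, 0.5) at level 0.05, BH rejects the first four hypotheses, since each of the four smallest p-values falls below its step-up threshold k/m times alpha.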
This talk is based on joint work with Lihua Lei and Yixiang Luo.
Will Fithian is an associate professor of statistics at UC Berkeley. He received his PhD from Stanford University in 2015, advised by Trevor Hastie.