Powerful Knockoffs via Minimizing Reconstructability

Here is a general-purpose, modular Python package that efficiently implements many existing knockoff methods (knockoff generation, test statistics, and more), including a framework for generating knockoffs that minimize reconstructability, which can substantially increase power over existing knockoff generators when covariates are correlated. And here is code for precisely replicating the experiments in the associated paper.

Floodgate: Inference for Model-Free Variable Importance

An R package and tutorials can be found here for running floodgate, which provides a lower confidence bound for the minimum mean squared error (mMSE) gap, an interpretable, model-free measure of variable importance that is sensitive to arbitrary nonlinearities and interactions.

The dCRT: An Exact, Fast, and Powerful Conditional Independence Test

R code and examples can be found here for running the distilled conditional randomization test (dCRT), a much faster way to run the conditional randomization test (CRT) for exact and powerful conditional independence testing.
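To give a feel for the distillation idea, here is a minimal NumPy sketch (not the package's API) of the single-covariate d0CRT variant, assuming X | Z is Gaussian with known conditional mean and standard deviation. The function name is illustrative, and ordinary least squares stands in for the lasso distillation used in high dimensions.

```python
import numpy as np

def d0_crt(y, x, Z, mean_x_given_Z, sd_x_given_Z, n_resamples=500, seed=0):
    """Sketch of the distilled CRT for one covariate x, given confounders Z.

    Assumes the model-X setting: x | Z ~ N(mean_x_given_Z, sd_x_given_Z^2)
    is known exactly, so resampling from it gives an exact randomization test.
    """
    rng = np.random.default_rng(seed)
    # Distillation step, done ONCE outside the resampling loop (this is
    # what makes the dCRT fast): regress y on Z and keep the residual.
    Z1 = np.column_stack([np.ones(len(y)), Z])
    r_y = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]

    def stat(x_):
        # Test statistic: |inner product| of the two residuals.
        return abs(np.dot(r_y, x_ - mean_x_given_Z)) / len(y)

    t_obs = stat(x)
    # Each resample only requires recomputing the cheap statistic.
    t_null = np.array([stat(rng.normal(mean_x_given_Z, sd_x_given_Z))
                       for _ in range(n_resamples)])
    # Exact finite-sample p-value of the randomization test.
    return (1 + np.sum(t_null >= t_obs)) / (1 + n_resamples)
```

Because the expensive regression of y on Z happens once rather than once per resample, the loop over resamples is essentially free, which is the source of the speedup over the vanilla CRT.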

Conditional Knockoffs: Relaxing the Assumptions of Knockoffs

R code can be found here and tutorials can be found here for running conditional knockoffs, a way to run model-X knockoffs (with all the same guarantees and nearly as much power) without assuming the covariate distribution is fully known, requiring only that it lie in a flexible parametric family.

Metropolized Knockoff Sampling

R and Python code can be found here and tutorials/notebooks can be found here for running the Metropolized knockoff sampler, which flexibly constructs exact model-X knockoffs using tools from the Markov chain Monte Carlo and graphical models literatures. A newer, more general knockoffs Python package is available here that includes an implementation of the Metropolized knockoff sampler.

Model-X Knockoffs: High-Dimensional Controlled Variable Selection

R, Python, and MATLAB packages, as well as examples/vignettes can be found here for running model-X knockoffs to perform variable selection while controlling the false discovery rate, even in high dimensions and when the conditional model for the response variable is unknown. A newer, more general knockoffs Python package is available here.

EigenPrism: Inference for High-Dimensional Signal-to-Noise Ratios

An R function for running EigenPrism to compute confidence intervals for the norm of the coefficient vector, the noise level, or the signal-to-noise ratio in high-dimensional regression problems, without assuming sparsity or random effects. MATLAB code implementing the EigenPrism procedure and reproducing the simulations and analyses in the paper.

Familywise Error Rate Control Via Knockoffs

MATLAB code implementing knockoffs for familywise error rate control and reproducing the figures in the paper. The method allows the user to control the familywise error rate in linear regression problems with more observations than variables.

QUARTS: A Robust Method for Paleoclimate Reconstructions

R code implementing QUAntile Regression with Time Series errors (QUARTS). Includes a script for applying QUARTS to a Northern Hemisphere paleoclimate reconstruction.

Monte Carlo Motion Planning (MCMP)

This GitHub repository contains a Julia implementation of MCMP, an algorithm for autonomously planning (in the presence of uncertainty) a robot's trajectory through obstacles with a prespecified lower bound on the probability of success (e.g. 99%).

Fast Marching Tree (FMT*)

The Open Motion Planning Library contains an open-source C++ implementation of FMT*. FMT* is an asymptotically optimal algorithm for autonomously planning a robot's trajectory through obstacles.
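For intuition about how FMT* works, here is a minimal Python sketch on a 2D point robot in the unit square with a single disk obstacle. The sample count, connection radius, and obstacle are arbitrary illustrative choices; a production implementation such as OMPL's uses spatial data structures for neighbor queries and handles general state spaces.

```python
import numpy as np

def fmt_star(start, goal, n_samples=400, radius=0.25, seed=0):
    """Minimal FMT* on the unit square with a disk obstacle at (0.5, 0.5)."""
    rng = np.random.default_rng(seed)
    center, r_obs = np.array([0.5, 0.5]), 0.2
    free = lambda p: np.linalg.norm(p - center) > r_obs
    def edge_free(a, b, k=20):        # collision-check an edge by sampling it
        return all(free(a + t * (b - a)) for t in np.linspace(0.0, 1.0, k))

    samples = rng.uniform(size=(n_samples, 2))
    pts = np.vstack([start, samples[[free(p) for p in samples]], goal])
    n = len(pts)
    dists = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    cost = np.full(n, np.inf); parent = np.full(n, -1)
    cost[0] = 0.0
    open_set, unvisited = {0}, set(range(1, n))
    while open_set:
        z = min(open_set, key=lambda i: cost[i])   # lowest-cost open node
        if z == n - 1:
            break                                  # goal reached
        # Try to connect each unvisited neighbor of z through its
        # locally optimal open parent, lazily checking only that one edge.
        for x in [i for i in unvisited if dists[z, i] < radius]:
            nbrs = [j for j in open_set if dists[x, j] < radius]
            y = min(nbrs, key=lambda j: cost[j] + dists[j, x])
            if edge_free(pts[y], pts[x]):
                cost[x], parent[x] = cost[y] + dists[y, x], y
                open_set.add(x); unvisited.discard(x)
        open_set.remove(z)
    path, i = [], n - 1                            # walk back from the goal
    while i != -1:
        path.append(pts[i]); i = parent[i]
    return path[::-1], cost[n - 1]
```

The key trade that makes FMT* fast is laziness: rather than collision-checking every candidate edge, it checks only the locally optimal connection for each node, which is asymptotically negligible in its effect on solution quality as the number of samples grows.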