
Daniel G. Alabi

Instructor, NaijaCoder
Junior Fellow, Simons Society of Fellows
Postdoctoral Researcher, Computer Science and Data Science, Columbia University
Postdoctoral Host: Daniel Hsu
alabid [at] cs [dot] columbia [dot] edu
Photo credit: My neighbor Sury @scrodcity

Short Bio

I received my Ph.D. in Computer Science from Harvard University, where I was advised by Salil Vadhan. During my Ph.D., I was a research intern at Microsoft Research New England and in the Discrete Algorithms Group at Google Research. In 2019, I received an S.M. from Harvard; before that, I obtained a B.A. in mathematics and a B.A. in computer science from Carleton College.

Research Overview

In general, I study the limits and capabilities of statistics for computer science applications. In particular, I'm interested in the tradeoffs in computational and statistical resources that result when we require provable guarantees (e.g., for privacy) on statistical models and machine learning predictors. Example computational resources include time, memory, randomness, communication, and parallelism. Example statistical resources include samples (or labeled examples for classification or regression) drawn from an unknown distribution.
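As a concrete illustration of a provable privacy guarantee on a statistic, here is a minimal sketch of the standard Laplace mechanism for an epsilon-differentially private mean. This is a generic textbook example for intuition only, not code from any of the papers below; the function names are mine.

```python
import math
import random

def laplace_noise(scale):
    # Sample from Laplace(0, scale) via the inverse CDF:
    # u uniform in (-0.5, 0.5), x = -scale * sgn(u) * ln(1 - 2|u|).
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_mean(xs, lo, hi, epsilon):
    """Epsilon-differentially private mean of values clamped to [lo, hi].

    Clamping bounds each record's influence on the sum by (hi - lo),
    so Laplace noise of scale (hi - lo) / (n * epsilon) on the mean
    suffices for epsilon-differential privacy.
    """
    n = len(xs)
    clamped = [min(max(x, lo), hi) for x in xs]
    true_mean = sum(clamped) / n
    return true_mean + laplace_noise((hi - lo) / (n * epsilon))

xs = [0.2, 0.4, 0.6, 0.8]
print(dp_mean(xs, 0.0, 1.0, epsilon=1.0))
```

The tradeoff mentioned above shows up directly here: a smaller epsilon (stronger privacy) inflates the noise scale, so more samples n are needed for the same accuracy.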

Selected Papers

Hypothesis Testing for Differentially Private Linear Regression
with Salil Vadhan
Advances in Neural Information Processing Systems (NeurIPS), 2022  [arxiv] 
Learning to Prune: Speeding up Repeated Computations
with Adam Tauman Kalai, Katrina Ligett, Cameron Musco, Christos Tzamos, and Ellen Vitercik
Proceedings of the Conference on Learning Theory (COLT), 2019  [arxiv]  [PMLR] 
Unleashing Linear Optimizers for Group-Fair Learning and Optimization
with Nicole Immorlica and Adam Tauman Kalai
Proceedings of the Conference on Learning Theory (COLT), 2018  [arxiv]  [PMLR] 
Learning Certifiably Optimal Rule Lists for Categorical Data
with Elaine Angelino, Nicholas Larus-Stone, Margo Seltzer, and Cynthia Rudin
Journal of Machine Learning Research (JMLR), 2018  [arxiv]  [JMLR] 
See all papers here.

Selected Code Artifacts

FlyLaTeX
Collaborative LaTeX Editor
CORELS
CORELS is a custom discrete optimization algorithm for building certifiably optimal rule lists over a categorical feature space.
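For readers unfamiliar with the model class, a rule list is an ordered sequence of if-then rules over categorical features with a default prediction at the end. The toy sketch below shows the structure only; it is not CORELS's API, and the rules and feature names are made up for illustration.

```python
def predict(example, rule_list, default):
    # rule_list: ordered (condition, prediction) pairs. The first rule
    # whose condition matches the example decides the prediction;
    # otherwise the default label is returned.
    for condition, prediction in rule_list:
        if all(example.get(k) == v for k, v in condition.items()):
            return prediction
    return default

# Hypothetical rules over categorical features.
rules = [
    ({"age": "18-25", "prior": "yes"}, 1),
    ({"prior": "no"}, 0),
]

print(predict({"age": "18-25", "prior": "yes"}, rules, default=0))  # → 1
```

CORELS searches the combinatorial space of such lists with branch-and-bound, returning a list that provably minimizes its regularized objective rather than a heuristic approximation.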
See my GitHub page for more code artifacts not listed here.

Teaching