Cameron Musco

Stata Center. 32 Vassar Street, Cambridge, MA 02139. Office 32-G670.   cnmusco at mit dot edu

I am a third-year Ph.D. student in the Theory of Computation Group at MIT's Computer Science and Artificial Intelligence Laboratory. I am lucky to be advised by Nancy Lynch and partially supported by an NSF Graduate Research Fellowship. I study algorithms and am especially interested in randomized linear algebraic computation and data analysis.

Before MIT, I studied Computer Science and Applied Mathematics at Yale University and worked as a web developer at Redfin.

Here are my Google Scholar profile and my rather sporadic and unfocused Quora profile.

Publications

Linear Algebra - Iterative Methods

Principal Component Projection Without Principal Component Analysis
Roy Frostig, Cameron Musco, Christopher Musco, and Aaron Sidford
International Conference on Machine Learning (ICML 2016)
Matlab Code: standard algorithm, faster Krylov subspace algorithm

Faster Eigenvector Computation via Shift-and-Invert Preconditioning
Dan Garber, Elad Hazan, Chi Jin, Sham M. Kakade, Cameron Musco, Praneeth Netrapalli, and Aaron Sidford
International Conference on Machine Learning (ICML 2016)

Randomized Block Krylov Methods for Stronger and Faster Approximate Singular Value Decomposition
Cameron Musco and Christopher Musco
Conference on Neural Information Processing Systems (NIPS 2015)
Selected for Oral Presentation (1 of 15 orals among 403 accepted papers).
Slides from my talk at NIPS. Matlab code.

Linear Algebra - Sketching and Sampling Methods

Provably Useful Kernel Matrix Approximation in Linear Time
Cameron Musco and Christopher Musco
Preprint, 2016
Matlab code (proof of concept implementation, not optimized for runtime).

Online Row Sampling
Michael B. Cohen, Cameron Musco, and Jakub Pachocki
International Workshop on Approximation Algorithms for Combinatorial Optimization Problems (APPROX 2016)

Ridge Leverage Scores for Low-Rank Approximation
Michael B. Cohen, Cameron Musco, and Christopher Musco
Preprint, 2015
Chris's slides from his talk at University of Utah.

Dimensionality Reduction for k-Means Clustering and Low Rank Approximation
Michael B. Cohen, Sam Elder, Cameron Musco, Christopher Musco, and Madalina Persu
ACM Symposium on Theory of Computing (STOC 2015)
Slides from my talk at MIT's Algorithms and Complexity Seminar.
My Master's Thesis containing empirical evaluation of the dimensionality reduction techniques studied along with a guide to implementation.

Uniform Sampling for Matrix Approximation
Michael B. Cohen, Yin Tat Lee, Cameron Musco, Christopher Musco, Richard Peng, and Aaron Sidford
Innovations in Theoretical Computer Science (ITCS 2015)
Slides from my talk at MIT's Algorithms and Complexity Seminar.

Single Pass Spectral Sparsification in Dynamic Streams
Michael Kapralov, Yin Tat Lee, Cameron Musco, Christopher Musco, and Aaron Sidford
IEEE Symposium on Foundations of Computer Science (FOCS 2014)
Invited to special issue of SIAM Journal on Computing
Chris's slides from his talks at FOCS and the Harvard TOC Seminar.

Biological Distributed Algorithms

Ant-Inspired Density Estimation via Random Walks
Cameron Musco, Hsin-Hao Su, and Nancy Lynch
ACM Symposium on Principles of Distributed Computing (PODC 2016)

Distributed House-Hunting in Ant Colonies
Mohsen Ghaffari, Cameron Musco, Tsvetomira Radeva, and Nancy Lynch
ACM Symposium on Principles of Distributed Computing (PODC 2015)

Computational Tradeoffs in Biological Neural Networks: Self-Stabilizing Winner-Take-All Networks
Nancy Lynch, Cameron Musco, and Merav Parter
Preprint, 2016

Other Writing

Here are a few writeups, notes, and talks. Some are super basic, but sometimes that's good.

Chebyshev Polynomials in TCS and Algorithm Design, outline of a talk I gave at the MIT Theory student retreat on the many applications of Chebyshev polynomials to upper and lower bounds in Theoretical Computer Science.
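As a small taste of what the talk is about (my own illustrative snippet, not taken from the talk itself): the key property behind most of these applications is that Chebyshev polynomials stay bounded by 1 on [-1, 1] yet blow up extremely fast just outside it.

```python
import numpy as np

def chebyshev(k, x):
    """Evaluate the Chebyshev polynomial T_k at x via the
    three-term recurrence: T_0 = 1, T_1 = x,
    T_{k+1}(x) = 2x * T_k(x) - T_{k-1}(x)."""
    t_prev, t_curr = np.ones_like(x), x
    if k == 0:
        return t_prev
    for _ in range(k - 1):
        t_prev, t_curr = t_curr, 2 * x * t_curr - t_prev
    return t_curr

x = np.linspace(-1, 1, 101)
# Bounded by 1 everywhere on [-1, 1]...
assert np.all(np.abs(chebyshev(10, x)) <= 1 + 1e-9)
# ...but already large just outside the interval, which is the
# growth property driving the upper- and lower-bound arguments.
print(chebyshev(10, np.array([1.1])))
```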

Subspace Scores for Feature Selection in Computer Vision, final project report where we test our column sampling algorithm for k-means/PCA dimension reduction (found here) and explore its use as a general feature selection technique.

Applications of Linear Sketching to Distributed Computing, slides for a talk I gave at our Theory of Distributed Systems seminar. High level overview of linear sketching, recent work on k-means clustering and spectral sparsification, and applications to distributed data analysis.

Graph Sparsification and Dimensionality Reduction, final project report for Jelani Nelson's Algorithms for Big Data class.

Linear Regression and Pseudoinverse Cheatsheet, since there are a lot of ways to explain the pseudoinverse and sometimes I forget the details.
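In the same spirit, here is a quick sanity check of the identity the cheatsheet revolves around (my own snippet, not part of the cheatsheet): for a full-column-rank A, the least-squares solution A⁺b agrees with the normal-equations form (AᵀA)⁻¹Aᵀb.

```python
import numpy as np

rng = np.random.default_rng(0)
# Overdetermined system: more equations than unknowns,
# so A has full column rank with probability 1.
A = rng.standard_normal((10, 3))
b = rng.standard_normal(10)

# Least-squares solution via the pseudoinverse.
x_pinv = np.linalg.pinv(A) @ b

# The normal-equations form (A^T A)^{-1} A^T b gives the
# same answer when A has full column rank.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

assert np.allclose(x_pinv, x_normal)
```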

Big-O and Asymptotic Notation Cheatsheet, since sometimes I forget even this.

Fast Approximation of Maximum Flow using Electrical Flows, undergraduate Applied Mathematics senior project report. I was fortunate to be advised by Dan Spielman for both my senior projects. They were my first introduction to research in computer science and the reason I decided to go to graduate school.

Graph Construction Through Laplacian Function Optimization, Computer Science senior project report.

Projects

I've worked on a lot of projects, some more serious than others.

I built this Rap Collaboration Graph, which gives a visualization of musical collaborations in hip hop. Unfortunately, colors only render in Safari right now, the graph looks fuzzy on retina displays, and the data is about a year out of date. But I promise I'll get to it.

I had a lot of fun helping build the first version of a site that sold buffalo chicken sandwiches. It's gotten a major facelift and evolved into Crunchbutton, but here is a screenshot of the original One Button Wenzel in its glory.

My friend Charlie and I once built an AI to play Transport Tycoon Deluxe. Here is a poster describing the project.

In college I also had a lot of fun working on Yale's Formula Hybrid Racecar Team.

I love to ski, and a long time ago, my brother Chris and I used to hand-build custom skis. Our first pair had maple/poplar cores, varnished wood sidewalls, and flex comparable to a pair of 2x4s. Our second iteration had more precisely shaped pine cores, a smoother top sheet, and a much smoother flex. The tips delaminated once, but we riveted them back together and still ski on them today! Here's an old forum post on SkiBuilders.com with more pictures of our setup and first pair.