
Big Learning Workshop: Algorithms, Systems, and Tools for Learning at Scale (NIPS 2011)
Invited Talk: Block splitting for Large-Scale Distributed Learning

Speaker: Neal Parikh, Ph.D. candidate in the Department of Computer Science at Stanford University.

Abstract: Machine learning and statistics with very large datasets is now a topic of widespread interest in both academia and industry. Many such tasks can be posed as convex optimization problems, so algorithms for distributed convex optimization serve as a powerful, general-purpose mechanism for training a wide class of models on datasets too large to process on a single machine. Previous work has shown how to solve such problems so that each machine sees only a subset of the training examples or a subset of the features. In this work, we extend these algorithms by showing how to split problems by both examples and features simultaneously, which is necessary for datasets that are very large in both dimensions. We present experiments with these algorithms run on Amazon's Elastic Compute Cloud.
Length: 25:24
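The core idea in the abstract, partitioning a problem by both examples and features, amounts to cutting the data matrix into a grid of blocks so that each worker holds only one submatrix. A minimal NumPy sketch (the function name `split_blocks` and the 3x2 grid are illustrative assumptions, not from the talk):

```python
import numpy as np

def split_blocks(A, M, N):
    """Partition A (examples x features) into an M x N grid of blocks,
    so worker (i, j) stores only a subset of examples AND features."""
    row_parts = np.array_split(A, M, axis=0)   # split by examples (rows)
    return [np.array_split(R, N, axis=1)       # then by features (columns)
            for R in row_parts]

A = np.arange(24.0).reshape(6, 4)   # toy data: 6 examples, 4 features
blocks = split_blocks(A, M=3, N=2)  # 3x2 grid; each block is 2x2 here

# Stacking the blocks back together recovers the original matrix,
# so no data is lost in the partition.
reassembled = np.vstack([np.hstack(row) for row in blocks])
assert np.array_equal(reassembled, A)
```

In the block-splitting algorithm itself, each worker would then run its local update on its block while a consensus step reconciles the shared variables across the grid; this sketch shows only the data layout, not the optimization iterations.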
