NIPS 2011 Domain Adaptation Workshop: Overfitting and Small Sample Statistics

Domain Adaptation Workshop: Theory and Application at NIPS 2011
Invited Talk: Overfitting and Small Sample Statistics
Speaker: Ruslan Salakhutdinov

Abstract: We study the prevalent problem in which a test distribution differs from the training distribution. We consider a setting where the training set consists of a small number of sample domains, but with many samples in each domain. Our goal is to generalize to a new domain. For example, we may want to learn a similarity function using only certain classes of objects, but we desire that this similarity function be applicable to object classes not present in our training sample (e.g., we might seek to learn that "dogs are similar to dogs" even though images of dogs were absent from our training set). Our theoretical analysis shows that we can select many more features than domains while avoiding overfitting by utilizing data-dependent variance properties. We present a greedy feature selection algorithm based on T-statistics. Our experiments validate this theory, showing that T-statistic-based greedy feature selection is more robust at avoiding overfitting than the classical greedy procedure.
Length: 15:47
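The abstract describes selecting features by how consistently they help across the few training domains, using a T-statistic rather than raw average performance. Below is a minimal, hypothetical sketch of that idea: it ranks features by the magnitude of a one-sample t-statistic computed over per-domain scores. The input format (`domain_scores[d][j]` = usefulness score of feature `j` in domain `d`) and the one-shot ranking are assumptions for illustration; the algorithm presented in the talk is a sequential greedy procedure whose details are not given in the abstract.

```python
import math

def t_statistic(values):
    """One-sample t-statistic of per-domain scores against zero:
    mean / sqrt(var / n). A large |t| indicates the feature's
    usefulness is consistent across the few training domains,
    not merely large on average in one of them."""
    n = len(values)
    mean = sum(values) / n
    # Sample variance across domains; tiny epsilon guards division by zero.
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean / math.sqrt(var / n + 1e-12)

def select_by_t_statistic(domain_scores, k):
    """Rank features by |t| over domains and keep the top k.

    domain_scores[d][j] is a hypothetical per-domain score for
    feature j in domain d (e.g., a correlation with the label).
    """
    n_domains = len(domain_scores)
    n_features = len(domain_scores[0])
    ranked = sorted(
        range(n_features),
        key=lambda j: abs(t_statistic([domain_scores[d][j]
                                       for d in range(n_domains)])),
        reverse=True,
    )
    return ranked[:k]

# Toy illustration: feature 0 is modestly but consistently useful;
# feature 1 has a high score in one domain but high variance across
# domains; feature 2 is consistently near zero. The t-statistic
# prefers the consistent feature 0.
scores = [
    [1.0,  5.0,  0.10],
    [1.1, -4.0, -0.10],
    [0.9,  0.5,  0.05],
]
print(select_by_t_statistic(scores, 1))  # → [0]
```

A plain mean-based ranking would prefer feature 1 in domains where it happens to score well; normalizing by the across-domain variance is what lets one safely consider many more features than domains, as the abstract argues.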
