  1. Article

    Author Correction: Learnability can be undecidable

    In the version of this Article originally published, the following text was missing from the Acknowledgements: ‘Part of the research was done while S.M. was at the Institute for Advanced Study in Princeton and...

    Shai Ben-David, Pavel Hrubeš, Shay Moran, Amir Shpilka in Nature Machine Intelligence (2019)

  2. Article

    Learnability can be undecidable

    The mathematical foundations of machine learning play a key role in the development of the field. They improve our understanding and provide tools for designing new learning paradigms. The advantages of mathem...

    Shai Ben-David, Pavel Hrubeš, Shay Moran, Amir Shpilka in Nature Machine Intelligence (2019)

  3. Chapter and Conference Paper

    On Version Space Compression

We study compressing labeled data samples so as to maintain version space information. While classic compression schemes [11] only ask for recovery of a sample's labels, many applications, such as distributed lea...

    Shai Ben-David, Ruth Urner in Algorithmic Learning Theory (2016)
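
A minimal sketch of the compression idea in the entry above (illustrative; not the paper's construction). For threshold classifiers on the real line, a sample's labels can be recovered from a single boundary example, but recovering the version space means keeping both examples flanking the decision boundary, since together they pin down exactly which thresholds remain consistent:

```python
import numpy as np

def compress_version_space(xs, ys):
    """For 1-D thresholds h_t(x) = [x >= t], keep only the two examples
    flanking the boundary: the largest point labeled 0 and the smallest
    labeled 1. Exactly the thresholds between them are consistent with
    the whole sample, so this pair determines the version space."""
    return xs[ys == 0].max(), xs[ys == 1].min()

xs = np.array([0.1, 0.4, 0.7, 1.5, 2.0])
ys = np.array([0, 0, 0, 1, 1])
lo, hi = compress_version_space(xs, ys)
print(f"consistent thresholds form the interval ({lo}, {hi}]")  # (0.7, 1.5]
```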

  4. Chapter and Conference Paper

    Finding Meaningful Cluster Structure Amidst Background Noise

We consider efficient clustering algorithms under data clusterability assumptions with added noise. In contrast with most of the literature on this topic, which considers either the adversarial noise setting or some noi...

    Shrinu Kushagra, Samira Samadi, Shai Ben-David in Algorithmic Learning Theory (2016)
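
The entry above concerns recovering cluster structure when clusterable data is contaminated by background noise. A generic two-stage strategy, sketched under assumptions of my own choosing (this is not the paper's algorithm): discard sparse points via a nearest-neighbor density test, then cluster the dense remainder.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors

def denoise_then_cluster(X, k, n_neighbors=10, keep_quantile=0.8):
    # Score each point by the distance to its n-th nearest neighbor;
    # large values suggest sparse background noise rather than cluster mass.
    dists, _ = NearestNeighbors(n_neighbors=n_neighbors).fit(X).kneighbors(X)
    radius = dists[:, -1]
    core = radius <= np.quantile(radius, keep_quantile)  # keep the densest 80%
    labels = np.full(len(X), -1)  # -1 marks points set aside as noise
    labels[core] = KMeans(n_clusters=k, n_init=10).fit_predict(X[core])
    return labels
```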

  5. Chapter and Conference Paper

    Multi-task and Lifelong Learning of Kernels

    We consider a problem of learning kernels for use in SVM classification in the multi-task and lifelong scenarios and provide generalization bounds on the error of a large margin classifier. Our results show th...

    Anastasia Pentina, Shai Ben-David in Algorithmic Learning Theory (2015)
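
A minimal sketch of the setup in the entry above, under simplifying assumptions: the kernel family is a small grid of RBF bandwidths, and one shared kernel is chosen to minimize average cross-validated error across all tasks. The paper's contribution is margin-based generalization bounds for such learned kernels; the loop below only illustrates the selection step.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def pick_shared_kernel(tasks, gammas=(0.01, 0.1, 1.0, 10.0)):
    """tasks: list of (X, y) pairs. Select a single RBF bandwidth for all
    tasks at once, so that later tasks reuse the kernel inferred from
    earlier ones (the multi-task / lifelong idea)."""
    def avg_error(gamma):
        return np.mean([1 - cross_val_score(SVC(kernel="rbf", gamma=gamma),
                                            X, y, cv=3).mean()
                        for X, y in tasks])
    return min(gammas, key=avg_error)
```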

  6. Chapter and Conference Paper

    Information Preserving Dimensionality Reduction

    Dimensionality reduction is a very common preprocessing approach in many machine learning tasks. The goal is to design data representations that on one hand reduce the dimension of the data (therefore allowing...

    Shrinu Kushagra, Shai Ben-David in Algorithmic Learning Theory (2015)
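
To make "information preserving" concrete, here is one generic reduction (a Johnson-Lindenstrauss-style random projection; the paper develops its own criteria for which information must survive). Pairwise distances are approximately preserved, so distance- and margin-based learners can still run on the reduced data:

```python
import numpy as np

def random_projection(X, target_dim, seed=0):
    """Project n points in R^d down to target_dim dimensions. Pairwise
    distances are preserved up to a (1 +/- eps) factor with high
    probability once target_dim is on the order of log(n) / eps**2."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[1], target_dim)) / np.sqrt(target_dim)
    return X @ R

X = np.random.default_rng(1).normal(size=(100, 1000))
Z = random_projection(X, target_dim=64)  # 1000-dim points -> 64-dim
```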

  7. Article

    Domain adaptation–can quantity compensate for quality?

    The Domain Adaptation problem in machine learning occurs when the distribution generating the test data differs from the one that generates the training data. A common approach to this issue is to train a stan...

    Shai Ben-David, Ruth Urner in Annals of Mathematics and Artificial Intelligence (2014)

  8. Chapter and Conference Paper

    On the Hardness of Domain Adaptation and the Utility of Unlabeled Target Samples

    The Domain Adaptation problem in machine learning occurs when the test and training data generating distributions differ. We consider the covariate shift setting, where the labeling function is the same in bot...

    Shai Ben-David, Ruth Urner in Algorithmic Learning Theory (2012)
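
A minimal sketch of a standard covariate-shift tool related to the two domain-adaptation entries above (not the papers' hardness constructions): estimate the target-to-source density ratio with a probabilistic domain classifier, then train the source learner under those importance weights. It assumes the base learner accepts per-example weights.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def importance_weights(X_source, X_target):
    """Estimate w(x) = p_target(x) / p_source(x) via a domain classifier:
    w(x) is proportional to P(target | x) / P(source | x)."""
    X = np.vstack([X_source, X_target])
    d = np.r_[np.zeros(len(X_source)), np.ones(len(X_target))]
    p = LogisticRegression(max_iter=1000).fit(X, d).predict_proba(X_source)[:, 1]
    p = np.clip(p, 1e-6, 1 - 1e-6)  # guard against division by zero
    return p / (1 - p)

def train_reweighted(X_s, y_s, X_t, base_learner):
    # Under covariate shift the labeling function is shared, so a source
    # learner reweighted toward target-like inputs targets the right risk.
    return base_learner.fit(X_s, y_s, sample_weight=importance_weights(X_s, X_t))
```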

  9. Chapter and Conference Paper

    Learning a Classifier when the Labeling Is Known

    We introduce a new model of learning, Known-Labeling-Classifier-Learning (KLCL). The goal of such learning is to find a low-error classifier from some given target-class of predictors, when the correct labeling i...

    Shalev Ben-David, Shai Ben-David in Algorithmic Learning Theory (2011)

  10. Article (Open Access)

    A theory of learning from different domains

    Discriminative learning methods for classification perform well when training and test data are drawn from the same distribution. Often, however, we have plentiful labeled training data from a source domain but w...

    Shai Ben-David, John Blitzer, Koby Crammer, Alex Kulesza in Machine Learning (2010)
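
The entry above bounds target-domain error by the source error plus a divergence between the two domains. A hedged sketch of the proxy commonly used to estimate such a divergence empirically (see the paper for the exact H-divergence definition): train a classifier to tell source points from target points and convert its held-out error into a distance.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def proxy_a_distance(X_source, X_target):
    """Roughly 0 when a domain classifier cannot beat chance (domains look
    alike) and close to 2 when the domains are perfectly separable."""
    X = np.vstack([X_source, X_target])
    y = np.r_[np.zeros(len(X_source)), np.ones(len(X_target))]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
    err = 1 - LogisticRegression(max_iter=1000).fit(X_tr, y_tr).score(X_te, y_te)
    return 2 * (1 - 2 * err)
```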

  11. Chapter and Conference Paper

    Theory-Practice Interplay in Machine Learning – Emerging Theoretical Challenges

Theoretical analysis has played a major role in some of the most prominent practical successes of statistical machine learning. However, mainstream machine learning theory makes some strong simplifying assum...

    Shai Ben-David in Machine Learning and Knowledge Discovery in Databases (2009)

  12. Article

    A notion of task relatedness yielding provable multiple-task learning guarantees

    The approach of learning multiple “related” tasks simultaneously has proven quite successful in practice; however, theoretical justification for this success has remained elusive. The starting point for previ...

    Shai Ben-David, Reba Schuller Borbely in Machine Learning (2008)

  13. Article

    A framework for statistical clustering with constant time approximation algorithms for K-median and K-means clustering

We consider a framework of sample-based clustering. In this setting, the input to a clustering algorithm is a sample generated i.i.d. by some unknown arbitrary distribution. Based on such a sample, the algorithm h...

    Shai Ben-David in Machine Learning (2007)
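
A minimal sketch of the sample-based setting in the entry above: cluster a small i.i.d. sample, then extend the solution to the whole domain by nearest-center assignment. The framework asks how large the sample must be for this to approximate the optimal k-median/k-means cost; the code below is illustrative only.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_from_sample(X_full, k, sample_size=500, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X_full), size=min(sample_size, len(X_full)), replace=False)
    km = KMeans(n_clusters=k, n_init=10).fit(X_full[idx])  # cluster the sample only
    return km.predict(X_full)  # extend to the full domain via nearest center
```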

  14. Chapter and Conference Paper

    Stability of k-Means Clustering

We consider the stability of k-means clustering problems. Clustering stability is a common heuristic used to determine the number of clusters in a wide variety of clustering applications. We continue the theoret...

    Shai Ben-David, Dávid Pál, Hans Ulrich Simon in Learning Theory (2007)
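
A minimal sketch of the stability heuristic analyzed in this entry (and revisited in entry 16 below): for each candidate k, cluster independent subsamples and measure how well the resulting solutions agree; the heuristic favors the most stable k. The agreement score used here, the adjusted Rand index, is one common choice rather than the paper's specific definition.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

def stability(X, k, n_trials=10, seed=0):
    """Average agreement between k-means solutions fitted on two random
    halves, compared by how they label the full data set."""
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_trials):
        i = rng.permutation(len(X))
        half_a, half_b = i[: len(X) // 2], i[len(X) // 2 :]
        ka = KMeans(n_clusters=k, n_init=10).fit(X[half_a])
        kb = KMeans(n_clusters=k, n_init=10).fit(X[half_b])
        scores.append(adjusted_rand_score(ka.predict(X), kb.predict(X)))
    return float(np.mean(scores))

# Heuristic: choose the k that maximizes stability(X, k) over candidates.
```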

  15. Chapter and Conference Paper

    Alternative Measures of Computational Complexity with Applications to Agnostic Learning

    We address a fundamental problem of complexity theory – the inadequacy of worst-case complexity for the task of evaluating the computational resources required for real life problems. While being the best know...

    Shai Ben-David in Theory and Applications of Models of Computation (2006)

  16. Chapter and Conference Paper

    A Sober Look at Clustering Stability

    Stability is a common tool to verify the validity of sample based algorithms. In clustering it is widely used to tune the parameters of the algorithm, such as the number k of clusters. In spite of the popularity ...

    Shai Ben-David, Ulrike von Luxburg, Dávid Pál in Learning Theory (2006)

  17. Chapter and Conference Paper

    Learning Bounds for Support Vector Machines with Learned Kernels

    Consider the problem of learning a kernel for use in SVM classification. We bound the estimation error of a large margin classifier when the kernel, relative to which this margin is defined, is chosen from a f...

    Nathan Srebro, Shai Ben-David in Learning Theory (2006)

  18. Chapter and Conference Paper

A Framework for Statistical Clustering with Constant Time Approximation Algorithms for K-Median Clustering

We consider a framework in which the clustering algorithm gets as input a sample generated i.i.d. by some unknown arbitrary distribution, and has to output a clustering of the full domain set, which is evaluated...

    Shai Ben-David in Learning Theory (2004)

  19. Chapter and Conference Paper

    Exploiting Task Relatedness for Multiple Task Learning

The approach of learning multiple “related” tasks simultaneously has proven quite successful in practice; however, theoretical justification for this success has remained elusive. The starting point for pre...

    Shai Ben-David, Reba Schuller in Learning Theory and Kernel Machines (2003)

  20. Chapter and Conference Paper

    Agnostic Boosting

    We extend the boosting paradigm to the realistic setting of agnostic learning, that is, to a setting where the training sample is generated by an arbitrary (unknown) probability distribution over examples and ...

    Shai Ben-David, Philip M. Long, Yishay Mansour in Computational Learning Theory (2001)
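
As a rough illustration of the boosting loop that the entry above extends to agnostic learning, here is the standard AdaBoost-style update (the realizable-style variant, not the paper's agnostic algorithm): reweight examples toward the current weak learner's mistakes and predict by weighted vote. In the agnostic setting, no hypothesis is assumed to fit the data; weak learners need only beat random guessing slightly under each reweighting.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost(X, y, rounds=20):
    """y takes values in {-1, +1}; weak learners are depth-1 stumps."""
    n = len(X)
    w = np.full(n, 1.0 / n)                 # start from the uniform distribution
    stumps, alphas = [], []
    for _ in range(rounds):
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = h.predict(X)
        eps = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)  # weighted error
        alpha = 0.5 * np.log((1 - eps) / eps)
        w *= np.exp(-alpha * y * pred)      # upweight the mistakes
        w /= w.sum()
        stumps.append(h)
        alphas.append(alpha)
    return lambda Xq: np.sign(sum(a * h.predict(Xq) for a, h in zip(alphas, stumps)))
```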
