
Page 1 of 2
  1. Article

    Rethinking statistical learning theory: learning using statistical invariants

    This paper introduces a new learning paradigm, called Learning Using Statistical Invariants (LUSI), which is different from the classical one. In a classical paradigm, the learning machine constructs a classif...

    Vladimir Vapnik, Rauf Izmailov in Machine Learning (2019)

  2. Article

    Knowledge transfer in SVM and neural networks

    The paper considers general machine learning models, where knowledge transfer is positioned as the main method to improve their convergence properties. Previous research was focused on mechanisms of knowledge ...

    Vladimir Vapnik, Rauf Izmailov in Annals of Mathematics and Artificial Intelligence (2017)

  3. Chapter and Conference Paper

    Learning with Intelligent Teacher

    The paper considers several topics on learning with privileged information: (1) general machine learning models, where privileged information is positioned as the main mechanism to improve their convergence pr...

    Vladimir Vapnik, Rauf Izmailov in Conformal and Probabilistic Prediction with Applications (2016)

  4. Chapter and Conference Paper

    Statistical Inference Problems and Their Rigorous Solutions

    This paper presents direct settings and rigorous solutions of Statistical Inference problems. It shows that rigorous solutions require solving ill-posed Fredholm integral equations of the first kind in the sit...

    Vladimir Vapnik, Rauf Izmailov in Statistical Learning and Data Sciences (2015)

  5. Chapter and Conference Paper

    Learning with Intelligent Teacher: Similarity Control and Knowledge Transfer

    This paper introduces an advanced setting of machine learning problem in which an Intelligent Teacher is involved. During training stage, Intelligent Teacher provides Student with information that contains, al...

    Vladimir Vapnik, Rauf Izmailov in Statistical Learning and Data Sciences (2015)

  6. Article (Open Access)

    Falsificationism and Statistical Learning Theory: Comparing the Popper and Vapnik-Chervonenkis Dimensions

    We compare Karl Popper’s ideas concerning the falsifiability of a theory with similar notions from the part of statistical learning theory known as VC-theory. Popper’s notion of the dimension of a theory is contr...

    David Corfield, Bernhard Schölkopf in Journal for General Philosophy of Science (2009)

  7. Article

    Large margin vs. large volume in transductive learning

    We consider a large volume principle for transductive learning that prioritizes the transductive equivalence classes according to the volume they occupy in hypothesis space. We approximate volume maximization ...

    Ran El-Yaniv, Dmitry Pechyony, Vladimir Vapnik in Machine Learning (2008)

  8. Chapter and Conference Paper

    Large Margin vs. Large Volume in Transductive Learning

    We focus on distribution-free transductive learning. In this setting the learning algorithm is given a ‘full sample’ of unlabeled points. Then, a training sample is selected uniformly at random from the full samp...

    Ran El-Yaniv, Dmitry Pechyony in Machine Learning and Knowledge Discovery i… (2008)

  9. Book

  10. Chapter

    Methods of Expected-Risk Minimization

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  11. Chapter

    Noninductive Methods of Inference: Direct Inference Instead of Generalization (2000–…)

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  12. Chapter

    Estimation of Regression Parameters

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  13. Chapter

    Realism and Instrumentalism: Classical Statistics and VC Theory (1960–1980)

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  14. Chapter

    Methods of Parametric Statistics for the Problem of Regression Estimation

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  15. Chapter

    A Method of Minimizing Empirical Risk for the Problem of Pattern Recognition

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  16. Chapter

    The Method of Structural Minimization of Risk

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  17. Chapter

    Estimation of Functional Values at Given Points

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  18. Chapter

    The Problem of Estimating Dependences from Empirical Data

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  19. Chapter

    Falsifiability and Parsimony: VC Dimension and the Number of Entities (1980–2000)

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)

  20. Chapter

    Methods of Parametric Statistics for the Pattern Recognition Problem

    Vladimir Vapnik in Estimation of Dependences Based on Empirical Data (2006)
