Note: Supplemental materials are not guaranteed with Rental or Used book purchases.
Semi-Supervised Learning [ISBN: 9780262514125], by Olivier Chapelle, Bernhard Schölkopf, and Alexander Zien.
Series Foreword | p. xi |
Preface | p. xiii |
Introduction to Semi-Supervised Learning | p. 1 |
Supervised, Unsupervised, and Semi-Supervised Learning | p. 1 |
When Can Semi-Supervised Learning Work? | p. 4 |
Classes of Algorithms and Organization of This Book | p. 8 |
Generative Models | |
A Taxonomy for Semi-Supervised Learning Methods | p. 15 |
The Semi-Supervised Learning Problem | p. 15 |
Paradigms for Semi-Supervised Learning | p. 17 |
Examples | p. 22 |
Conclusions | p. 31 |
Semi-Supervised Text Classification Using EM | p. 33 |
Introduction | p. 33 |
A Generative Model for Text | p. 35 |
Experimental Results with Basic EM | p. 41 |
Using a More Expressive Generative Model | p. 43 |
Overcoming the Challenges of Local Maxima | p. 49 |
Conclusions and Summary | p. 54 |
Risks of Semi-Supervised Learning | p. 57 |
Do Unlabeled Data Improve or Degrade Classification Performance? | p. 57 |
Understanding Unlabeled Data: Asymptotic Bias | p. 59 |
The Asymptotic Analysis of Generative Semi-Supervised Learning | p. 63 |
The Value of Labeled and Unlabeled Data | p. 67 |
Finite Sample Effects | p. 69 |
Model Search and Robustness | p. 70 |
Conclusion | p. 71 |
Probabilistic Semi-Supervised Clustering with Constraints | p. 73 |
Introduction | p. 74 |
HMRF Model for Semi-Supervised Clustering | p. 75 |
HMRF-KMeans Algorithm | p. 81 |
Active Learning for Constraint Acquisition | p. 93 |
Experimental Results | p. 96 |
Related Work | p. 100 |
Conclusions | p. 101 |
Low-Density Separation | |
Transductive Support Vector Machines | p. 105 |
Introduction | p. 105 |
Transductive Support Vector Machines | p. 108 |
Why Use Margin on the Test Set? | p. 111 |
Experiments and Applications of TSVMs | p. 112 |
Solving the TSVM Optimization Problem | p. 114 |
Connection to Related Approaches | p. 116 |
Summary and Conclusions | p. 116 |
Semi-Supervised Learning Using Semi-Definite Programming | p. 119 |
Relaxing SVM Transduction | p. 119 |
An Approximation for Speedup | p. 126 |
General Semi-Supervised Learning Settings | p. 128 |
Empirical Results | p. 129 |
Summary and Outlook | p. 133 |
Appendix: The Extended Schur Complement Lemma | p. 134 |
Gaussian Processes and the Null-Category Noise Model | p. 137 |
Introduction | p. 137 |
The Noise Model | p. 141 |
Process Model and the Effect of the Null-Category | p. 143 |
Posterior Inference and Prediction | p. 145 |
Results | p. 147 |
Discussion | p. 149 |
Entropy Regularization | p. 151 |
Introduction | p. 151 |
Derivation of the Criterion | p. 152 |
Optimization Algorithms | p. 155 |
Related Methods | p. 158 |
Experiments | p. 160 |
Conclusion | p. 166 |
Appendix: Proof of Theorem 9.1 | p. 166 |
Data-Dependent Regularization | p. 169 |
Introduction | p. 169 |
Information Regularization on Metric Spaces | p. 174 |
Information Regularization and Relational Data | p. 182 |
Discussion | p. 189 |
Graph-Based Models | |
Label Propagation and Quadratic Criterion | p. 193 |
Introduction | p. 193 |
Label Propagation on a Similarity Graph | p. 194 |
Quadratic Cost Criterion | p. 198 |
From Transduction to Induction | p. 205 |
Incorporating Class Prior Knowledge | p. 205 |
Curse of Dimensionality for Semi-Supervised Learning | p. 206 |
Discussion | p. 215 |
The Geometric Basis of Semi-Supervised Learning | p. 217 |
Introduction | p. 217 |
Incorporating Geometry in Regularization | p. 220 |
Algorithms | p. 224 |
Data-Dependent Kernels for Semi-Supervised Learning | p. 229 |
Linear Methods for Large-Scale Semi-Supervised Learning | p. 231 |
Connections to Other Algorithms and Related Work | p. 232 |
Future Directions | p. 234 |
Discrete Regularization | p. 237 |
Introduction | p. 237 |
Discrete Analysis | p. 239 |
Discrete Regularization | p. 245 |
Conclusion | p. 249 |
Semi-Supervised Learning with Conditional Harmonic Mixing | p. 251 |
Introduction | p. 251 |
Conditional Harmonic Mixing | p. 255 |
Learning in CHM Models | p. 256 |
Incorporating Prior Knowledge | p. 261 |
Learning the Conditionals | p. 261 |
Model Averaging | p. 262 |
Experiments | p. 263 |
Conclusions | p. 273 |
Change of Representation | |
Graph Kernels by Spectral Transforms | p. 277 |
The Graph Laplacian | p. 278 |
Kernels by Spectral Transforms | p. 280 |
Kernel Alignment | p. 281 |
Optimizing Alignment Using QCQP for Semi-Supervised Learning | p. 282 |
Semi-Supervised Kernels with Order Constraints | p. 283 |
Experimental Results | p. 285 |
Conclusion | p. 289 |
Spectral Methods for Dimensionality Reduction | p. 293 |
Introduction | p. 293 |
Linear Methods | p. 295 |
Graph-Based Methods | p. 297 |
Kernel Methods | p. 303 |
Discussion | p. 306 |
Modifying Distances | p. 309 |
Introduction | p. 309 |
Estimating DBD Metrics | p. 312 |
Computing DBD Metrics | p. 321 |
Semi-Supervised Learning Using Density-Based Metrics | p. 327 |
Conclusions and Future Work | p. 329 |
Semi-Supervised Learning in Practice | |
Large-Scale Algorithms | p. 333 |
Introduction | p. 333 |
Cost Approximations | p. 334 |
Subset Selection | p. 337 |
Discussion | p. 340 |
Semi-Supervised Protein Classification Using Cluster Kernels | p. 343 |
Introduction | p. 343 |
Representation and Kernels for Protein Sequences | p. 345 |
Semi-Supervised Kernels for Protein Sequences | p. 348 |
Experiments | p. 352 |
Discussion | p. 358 |
Prediction of Protein Function from Networks | p. 361 |
Introduction | p. 361 |
Graph-Based Semi-Supervised Learning | p. 364 |
Combining Multiple Graphs | p. 366 |
Experiments on Function Prediction of Proteins | p. 369 |
Conclusion and Outlook | p. 374 |
Analysis of Benchmarks | p. 377 |
The Benchmark | p. 377 |
Application of SSL Methods | p. 383 |
Results and Discussion | p. 390 |
Perspectives | |
An Augmented PAC Model for Semi-Supervised Learning | p. 397 |
Introduction | p. 398 |
A Formal Framework | p. 400 |
Sample Complexity Results | p. 403 |
Algorithmic Results | p. 412 |
Related Models and Discussion | p. 416 |
Metric-Based Approaches for Semi-Supervised Regression and Classification | p. 421 |
Introduction | p. 421 |
Metric Structure of Supervised Learning | p. 423 |
Model Selection | p. 426 |
Regularization | p. 436 |
Classification | p. 445 |
Conclusion | p. 449 |
Transductive Inference and Semi-Supervised Learning | p. 453 |
Problem Settings | p. 453 |
Problem of Generalization in Inductive and Transductive Inference | p. 455 |
Structure of the VC Bounds and Transductive Inference | p. 457 |
The Symmetrization Lemma and Transductive Inference | p. 458 |
Bounds for Transductive Inference | p. 459 |
The Structural Risk Minimization Principle for Induction and Transduction | p. 460 |
Combinatorics in Transductive Inference | p. 462 |
Measures of Size of Equivalence Classes | p. 463 |
Algorithms for Inductive and Transductive SVMs | p. 465 |
Semi-Supervised Learning | p. 470 |
Conclusion: Transductive Inference and the New Problems of Inference | p. 470 |
Beyond Transduction: Selective Inference | p. 471 |
A Discussion of Semi-Supervised Learning and Transduction | p. 473 |
References | p. 479 |
Notation and Symbols | p. 499 |
Contributors | p. 503 |
Index | p. 509 |
Online Index | |
Table of Contents provided by Publisher. All Rights Reserved. |