Statistical Analysis Techniques in Particle Physics: Fits, Density Estimation and Supervised Learning

by Ilya Narsky; Frank C. Porter
Edition: 1st
ISBN13: 9783527410866
ISBN10: 3527410864
Format: Paperback
Pub. Date: 12/23/2013
Publisher(s): Wiley-VCH
List Price: $106.61

Rent Textbook (Recommended)
Price: $95.95

Buy New Textbook
Currently Available, Usually Ships in 24-48 Hours
Price: $102.28

Used Textbook
Sold Out

eTextbook
Not Available

More New and Used from Private Sellers
Starting at $90.74

Questions About This Book?

Why should I rent this book?
Renting is easy, fast, and cheap! Renting from eCampus.com can save you hundreds of dollars compared to the cost of new or used books each semester. At the end of the semester, simply ship the book back to us with a free UPS shipping label! No need to worry about selling it back.
How do rental returns work?
Returning books is as easy as possible. As your rental due date approaches, we will email you several courtesy reminders. When you are ready to return, you can print a free UPS shipping label from our website at any time. Then, just return the book to your UPS driver or any staffed UPS location. You can even use the same box we shipped it in!
What version or edition is this?
This is the 1st edition with a publication date of 12/23/2013.
What is included with this book?
  • The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any CDs, lab manuals, study guides, etc.
  • The Rental copy of this book is not guaranteed to include any supplemental materials. You may receive a brand new copy, but typically you will receive only the book itself.

Summary

Modern analysis of high-energy physics (HEP) data requires advanced statistical tools to separate signal from background. This is the first book to focus on machine learning techniques for this purpose. It will be of interest to almost every high-energy physicist and, given its breadth of coverage, is also suitable for students.

Author Biography

The authors are experts in the use of statistics in particle physics data analysis. Frank C. Porter is Professor of Physics at the California Institute of Technology and has lectured extensively at Caltech, the SLAC Laboratory at Stanford, and elsewhere. Ilya Narsky is Senior MATLAB Developer at The MathWorks, a leading developer of technical computing software for engineers and scientists, and the initiator of StatPatternRecognition, a C++ package for statistical analysis of HEP data. Together, they have taught courses for graduate students and postdocs.

Table of Contents

PARAMETRIC UNBINNED LIKELIHOOD FITS
Fits for Small Statistics
Fits Near the Boundary of the Physical Region
Likelihood Ratio Test for Presence of Signal
sPlots
GOODNESS OF FIT PROBLEM
Binned Goodness-of-Fit Tests
Statistics Converging to Chi-Square
Univariate Unbinned Goodness-of-Fit Tests: Kolmogorov-Smirnov, Anderson-Darling, Watson, and Neyman Smooth Tests
Multivariate Tests
RESAMPLING TECHNIQUES
Jackknife, Bootstrap and Cross-Validation
Choice of the Optimal Resampling Method: Bias, Variance and the Learning Curve
Resampling Weighted Observations
NON-PARAMETRIC DENSITY ESTIMATION
Equal-Bin and Adaptive Histograms
Optimal Binning
Density Estimation by Kernels
Optimal Kernel Size
The Curse of Dimensionality
BASIC CONCEPTS AND DEFINITIONS OF MACHINE LEARNING
Supervised, Unsupervised and Semi-Supervised Learning
Batch and Online Learning
Sequential and Parallel Learning
Classification and Regression
Training, Validation and Test
Categorical Variables
Missing Values
DATA PRE-PROCESSING
Linear Transformations and Dimensionality Reduction
Principal and Independent Component Analysis
Partial Least Squares
INTRODUCTION TO CLASSIFICATION
Forms of Classification Loss
Perfect Classifier to Bayes 0-1 Loss
Bias and Variance of a Classifier
Data with Unbalanced Classes and Unequal Misclassification Costs
MONITORING CLASSIFIER PERFORMANCE
ROC and Other Performance Curves
Confidence Bounds for ROC Curves
Testing if One Classifier Outperforms Another
Comparison Across Multiple Classifiers
LINEAR AND QUADRATIC DISCRIMINANT ANALYSIS
Testing Multivariate Normality
Logistic Regression for Data with Two Classes
BUMP HUNTING IN HIGH-DIMENSIONAL DATA
Search for Rectangular Regions by High-Dimensional Optimization and PRIM Algorithms
Voronoi Tessellation and SLEUTH Algorithm
NEURAL NETWORKS
Back-Propagation
Activation Functions
Optimization of Neural Nets by Genetic Algorithms
LOCAL METHODS
Nearest Neighbors
Radial Basis Functions, Thin-Plate Splines, Regularized Least Squares, and Support Vector Machines (SVM)
Multiclass Extensions of SVM
The Curse of Dimensionality
DECISION TREES
Splitting Criteria
Binary and Multiway Splits
Pruning Decision Trees
Surrogate Splits and Their Uses
ENSEMBLE METHODS
Boosting: AdaBoostM1 Algorithm
Boosting Multiclass Learners
Bagging and Random Forest
Boosting as Fitting a Stagewise Additive Model
Convex Loss Functions and Label Noise
Pruning (Post-Fitting) Ensembles
UNIFIED APPROACH FOR REDUCING MULTICLASS TO BINARY
Error Correcting Output Code
COMBINING CLASSIFIERS
Trainable Combiners
Mixture of Experts
Optimal Linear Combination of Classifiers
Stacked Generalization
METHODS FOR VARIABLE SELECTION
Filters, Wrappers and Embedded Methods
Relevance and Redundancy of Variables
Variable Ranking and Optimal Subset Selection
Generalized Sequential Forward Addition and Backward Elimination
Variable Importance Using Nearest Neighbors (ReliefF Algorithm)
Variable Importance from Randomized Subsets
Variable Importance from Decision Trees
SURVEY OF SOFTWARE PACKAGES FOR MACHINE LEARNING


