


Applied Multivariate Statistical Analysis

by Richard A. Johnson; Dean W. Wichern
  • ISBN13: 9780130925534
  • ISBN10: 0130925535

  • Edition: 5th
  • Format: Hardcover
  • Copyright: 2008-01-01
  • Publisher: Pearson College Div

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks!

List Price: $134.40 (save up to $33.60)

  • Buy Used
    $100.80
    Free shipping; usually ships in 2-4 business days.

Summary

This market-leading book offers a readable introduction to the statistical analysis of multivariate observations. Its overarching goal is to provide readers with the knowledge necessary to make proper interpretations and select appropriate techniques for analyzing multivariate data.

Chapter topics include aspects of multivariate analysis, matrix algebra and random vectors, sample geometry and random sampling, the multivariate normal distribution, inferences about a mean vector, comparisons of several multivariate means, multivariate linear regression models, principal components, factor analysis and inference for structured covariance matrices, canonical correlation analysis, and discrimination and classification.

For experimental scientists in a variety of disciplines.
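
To make the topics above concrete, here is a minimal, hypothetical Python sketch (not taken from the book) of two computations the text treats at length: the sample mean vector and covariance matrix, and principal components obtained from the covariance matrix. It assumes NumPy is available and uses synthetic data purely for illustration.

    # Hypothetical illustration, not the book's code: sample summaries and
    # principal components for a small synthetic multivariate data set.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))      # 50 observations on p = 3 variables

    x_bar = X.mean(axis=0)            # sample mean vector
    S = np.cov(X, rowvar=False)       # sample covariance matrix (p x p)

    # Sample principal components: eigenvectors of S, ordered by decreasing eigenvalue.
    eigvals, eigvecs = np.linalg.eigh(S)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    scores = (X - x_bar) @ eigvecs    # principal component scores
    print("Proportion of variance explained:", eigvals / eigvals.sum())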

Table of Contents

Preface xv
Aspects of Multivariate Analysis 1(49)
Introduction 1(2)
Applications of Multivariate Techniques 3(2)
The Organization of Data 5(14)
Arrays 5(1)
Descriptive Statistics 6(5)
Graphical Techniques 11(8)
Data Displays and Pictorial Representations 19(11)
Linking Multiple Two-Dimensional Scatter Plots 20(4)
Graphs of Growth Curves 24(1)
Stars 25(3)
Chernoff Faces 28(2)
Distance 30(8)
Final Comments 38(12)
Exercises 38(10)
References 48(2)
Matrix Algebra and Random Vectors 50(62)
Introduction 50(1)
Some Basics of Matrix and Vector Algebra 50(11)
Vectors 50(5)
Matrices 55(6)
Positive Definite Matrices 61(5)
A Square-Root Matrix 66(1)
Random Vectors and Matrices 67(1)
Mean Vectors and Covariance Matrices 68(11)
Partitioning the Covariance Matrix 74(2)
The Mean Vector and Covariance Matrix for Linear Combinations of Random Variables 76(2)
Partitioning the Sample Mean Vector and Covariance Matrix 78(1)
Matrix Inequalities and Maximization 79(33)
Supplement 2A: Vectors and Matrices: Basic Concepts 84(1)
Vectors 84(5)
Matrices 89(15)
Exercises 104(7)
References 111(1)
Sample Geometry and Random Sampling 112(37)
Introduction 112(1)
The Geometry of the Sample 112(8)
Random Samples and the Expected Values of the Sample Mean and Covariance Matrix 120(4)
Generalized Variance 124(15)
Situations in which the Generalized Sample Variance Is Zero 130(6)
Generalized Variance Determined by |R| and Its Geometrical Interpretation 136(2)
Another Generalization of Variance 138(1)
Sample Mean, Covariance, and Correlation As Matrix Operations 139(2)
Sample Values of Linear Combinations of Variables 141(8)
Exercises 145(3)
References 148(1)
The Multivariate Normal Distribution 149(61)
Introduction 149(1)
The Multivariate Normal Density and Its Properties 149(19)
Additional Properties of the Multivariate Normal Distribution 156(12)
Sampling from a Multivariate Normal Distribution and Maximum Likelihood Estimation 168(5)
The Multivariate Normal Likelihood 168(2)
Maximum Likelihood Estimation of μ and Σ 170(3)
Sufficient Statistics 173(1)
The Sampling Distribution of X̄ and S 173(2)
Properties of the Wishart Distribution 174(1)
Large-Sample Behavior of X̄ and S 175(2)
Assessing the Assumption of Normality 177(12)
Evaluating the Normality of the Univariate Marginal Distributions 178(5)
Evaluating Bivariate Normality 183(6)
Detecting Outliers and Cleaning Data 189(5)
Steps for Detecting Outliers 190(4)
Transformations To Near Normality 194(16)
Transforming Multivariate Observations 198(4)
Exercises 202(7)
References 209(1)
Inferences About a Mean Vector 210(62)
Introduction 210(1)
The Plausibility of μ0 as a Value for a Normal Population Mean 210(6)
Hotelling's T² and Likelihood Ratio Tests 216(4)
General Likelihood Ratio Method 219(1)
Confidence Regions and Simultaneous Comparisons of Component Means 220(14)
Simultaneous Confidence Statements 223(6)
A Comparison of Simultaneous Confidence Intervals with One-at-a-Time Intervals 229(3)
The Bonferroni Method of Multiple Comparisons 232(2)
Large Sample Inferences about a Population Mean Vector 234(5)
Multivariate Quality Control Charts 239(13)
Charts for Monitoring a Sample of Individual Multivariate Observations for Stability 241(6)
Control Regions for Future Individual Observations 247(1)
Control Ellipse for Future Observations 248(1)
T²-Chart for Future Observations 248(1)
Control Charts Based on Subsample Means 249(2)
Control Regions for Future Subsample Observations 251(1)
Inferences about Mean Vectors when Some Observations Are Missing 252(4)
Difficulties Due to Time Dependence in Multivariate Observations 256(16)
Supplement 5A: Simultaneous Confidence Intervals and Ellipses as Shadows of the p-Dimensional Ellipsoids 258(2)
Exercises 260(10)
References 270(2)
Comparisons of Several Multivariate Means 272(82)
Introduction 272(1)
Paired Comparisons and a Repeated Measures Design 272(11)
Paired Comparisons 272(6)
A Repeated Measures Design for Comparing Treatments 278(5)
Comparing Mean Vectors from Two Populations 283(10)
Assumptions Concerning the Structure of the Data 283(1)
Further Assumptions when n1 and n2 Are Small 284(3)
Simultaneous Confidence Intervals 287(3)
The Two-Sample Situation when Σ1 ≠ Σ2 290(3)
Comparing Several Multivariate Population Means (One-Way MANOVA) 293(12)
Assumptions about the Structure of the Data for One-way MANOVA 293(1)
A Summary of Univariate ANOVA 293(5)
Multivariate Analysis of Variance (MANOVA) 298(7)
Simultaneous Confidence Intervals for Treatment Effects 305(2)
Two-Way Multivariate Analysis of Variance 307(11)
Univariate Two-Way Fixed-Effects Model with Interaction 307(2)
Multivariate Two-Way Fixed-Effects Model with Interaction 309(9)
Profile Analysis 318(5)
Repeated Measures Designs and Growth Curves 323(4)
Perspectives and a Strategy for Analyzing Multivariate Models 327(27)
Exercises 332(20)
References 352(2)
Multivariate Linear Regression Models 354(72)
Introduction 354(1)
The Classical Linear Regression Model 354(4)
Least Squares Estimation 358(7)
Sum-of-Squares Decomposition 360(1)
Geometry of Least Squares 361(2)
Sampling Properties of Classical Least Squares Estimators 363(2)
Inferences About the Regression Model 365(9)
Inferences Concerning the Regression Parameters 365(5)
Likelihood Ratio Tests for the Regression Parameters 370(4)
Inferences from the Estimated Regression Function 374(3)
Estimating the Regression Function at z0 374(1)
Forecasting a New Observation at z0 375(2)
Model Checking and Other Aspects of Regression 377(6)
Does the Model Fit? 377(3)
Leverage and Influence 380(1)
Additional Problems in Linear Regression 380(3)
Multivariate Multiple Regression 383(15)
Likelihood Ratio Tests for Regression Parameters 392(3)
Other Multivariate Test Statistics 395(1)
Predictions from Multivariate Multiple Regressions 395(3)
The Concept of Linear Regression 398(9)
Prediction of Several Variables 403(3)
Partial Correlation Coefficient 406(1)
Comparing the Two Formulations of the Regression Model 407(3)
Mean Corrected Form of the Regression Model 407(2)
Relating the Formulations 409(1)
Multiple Regression Models with Time Dependent Errors 410(16)
Supplement 7A: The Distribution of the Likelihood Ratio for the Multivariate Multiple Regression Model 415(2)
Exercises 417(7)
References 424(2)
Principal Components 426(51)
Introduction 426(1)
Population Principal Components 426(11)
Principal Components Obtained from Standardized Variables 432(3)
Principal Components for Covariance Matrices with Special Structures 435(2)
Summarizing Sample Variation by Principal Components 437(13)
The Number of Principal Components 440(4)
Interpretation of the Sample Principal Components 444(1)
Standardizing the Sample Principal Components 445(5)
Graphing the Principal Components 450(2)
Large Sample Inferences 452(3)
Large Sample Properties of λi and ei 452(1)
Testing for the Equal Correlation Structure 453(2)
Monitoring Quality with Principal Components 455(22)
Checking a Given Set of Measurements for Stability 455(4)
Controlling Future Values 459(3)
Supplement 8A: The Geometry of the Sample Principal Component Approximation 462(2)
The p-Dimensional Geometrical Interpretation 464(1)
The n-Dimensional Geometrical Interpretation 465(1)
Exercises 466(9)
References 475(2)
Factor Analysis and Inference for Structured Covariance Matrices 477(65)
Introduction 477(1)
The Orthogonal Factor Model 478(6)
Methods of Estimation 484(17)
The Principal Component (and Principal Factor) Method 484(6)
A Modified Approach---the Principal Factor Solution 490(2)
The Maximum Likelihood Method 492(6)
A Large Sample Test for the Number of Common Factors 498(3)
Factor Rotation 501(9)
Oblique Rotations 509(1)
Factor Scores 510(7)
The Weighted Least Squares Method 511(2)
The Regression Method 513(4)
Perspectives and a Strategy for Factor Analysis 517(7)
Structural Equation Models 524(18)
The LISREL Model 525(1)
Construction of a Path Diagram 525(1)
Covariance Structure 526(1)
Estimation 527(2)
Model-Fitting Strategy 529(1)
Supplement 9A: Some Computational Details for Maximum Likelihood Estimation 530(1)
Recommended Computational Scheme 531(1)
Maximum Likelihood Estimators of 532(1)
Exercises 533(8)
References 541(1)
Canonical Correlation Analysis 542(39)
Introduction 543(1)
Canonical Variates and Canonical Correlations 543(8)
Interpreting the Population Canonical Variables 551(5)
Identifying the Canonical Variables 551(2)
Canonical Correlations as Generalizations of Other Correlation Coefficients 553(1)
The First r Canonical Variables as a Summary of Variability 554(1)
A Geometrical Interpretation of the Population Canonical Correlation Analysis 555(1)
The Sample Canonical Variates and Sample Canonical Correlations 556(8)
Additional Sample Descriptive Measures 564(5)
Matrices of Errors of Approximations 564(3)
Proportions of Explained Sample Variance 567(2)
Large Sample Inferences 569(12)
Exercises 573(7)
References 580(1)
Discrimination and Classification 581(87)
Introduction 581(1)
Separation and Classification for Two Populations 582(8)
Classification with Two Multivariate Normal Populations 590(8)
Classification of Normal Populations When Σ1 = Σ2 = Σ 590(5)
Scaling 595(1)
Classification of Normal Populations When Σ1 ≠ Σ2 596(2)
Evaluating Classification Functions 598(11)
Fisher's Discriminant Function---Separation of Populations 609(3)
Classification with Several Populations 612(16)
The Minimum Expected Cost of Misclassification Method 613(3)
Classification with Normal Populations 616(12)
Fisher's Method for Discriminating among Several Populations 628(13)
Using Fisher's Discriminants to Classify Objects 635(6)
Final Comments 641(27)
Including Qualitative Variables 641(1)
Classification Trees 641(3)
Neural Networks 644(1)
Selection of Variables 645(1)
Testing for Group Differences 645(1)
Graphics 646(1)
Practical Considerations Regarding Multivariate Normality 646(1)
Exercises 647(19)
References 666(2)
Clustering, Distance Methods, and Ordination 668(80)
Introduction 668(2)
Similarity Measures 670(9)
Distances and Similarity Coefficients for Pairs of Items 670(6)
Similarities and Association Measures for Pairs of Variables 676(1)
Concluding Comments on Similarity 677(2)
Hierarchical Clustering Methods 679(15)
Single Linkage 681(4)
Complete Linkage 685(4)
Average Linkage 689(1)
Ward's Hierarchical Clustering Method 690(3)
Final Comments---Hierarchical Procedures 693(1)
Nonhierarchical Clustering Methods 694(6)
K-means Method 694(4)
Final Comments---Nonhierarchical Procedures 698(2)
Multidimensional Scaling 700(9)
The Basic Algorithm 700(9)
Correspondence Analysis 709(10)
Algebraic Development of Correspondence Analysis 711(7)
Inertia 718(1)
Interpretation in Two Dimensions 719(1)
Final Comments 719(1)
Biplots for Viewing Sampling Units and Variables 719(4)
Constructing Biplots 720(3)
Procrustes Analysis: A Method for Comparing Configurations 723(25)
Constructing the Procrustes Measure of Agreement 724(7)
Supplement 12A: Data Mining 731(1)
Introduction 731(1)
The Data Mining Process 732(1)
Model Assessment 733(5)
Exercises 738(7)
References 745(3)
Appendix 748(10)
Data Index 758(3)
Subject Index 761

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.

Excerpts

INTENDED AUDIENCE

This book originally grew out of our lecture notes for an "Applied Multivariate Analysis" course offered jointly by the Statistics Department and the School of Business at the University of Wisconsin-Madison. Applied Multivariate Statistical Analysis, Fifth Edition, is concerned with statistical methods for describing and analyzing multivariate data. Data analysis, while interesting with one variable, becomes truly fascinating and challenging when several variables are involved. Researchers in the biological, physical, and social sciences frequently collect measurements on several variables. Modern computer packages readily provide the numerical results to rather complex statistical analyses. We have tried to provide readers with the supporting knowledge necessary for making proper interpretations, selecting appropriate techniques, and understanding their strengths and weaknesses. We hope our discussions will meet the needs of experimental scientists, in a wide variety of subject matter areas, as a readable introduction to the statistical analysis of multivariate observations.

LEVEL

Our aim is to present the concepts and methods of multivariate analysis at a level that is readily understandable by readers who have taken two or more statistics courses. We emphasize the applications of multivariate methods and, consequently, have attempted to make the mathematics as palatable as possible. We avoid the use of calculus. On the other hand, the concepts of a matrix and of matrix manipulations are important. We do not assume the reader is familiar with matrix algebra. Rather, we introduce matrices as they appear naturally in our discussions, and we then show how they simplify the presentation of multivariate models and techniques. The introductory account of matrix algebra, in Chapter 2, highlights the more important matrix algebra results as they apply to multivariate analysis. The Chapter 2 supplement provides a summary of matrix algebra results for those with little or no previous exposure to the subject. This supplementary material helps make the book self-contained and is used to complete proofs. The proofs may be ignored on the first reading. In this way we hope to make the book accessible to a wide audience.

In our attempt to make the study of multivariate analysis appealing to a large audience of both practitioners and theoreticians, we have had to sacrifice a consistency of level. Some sections are harder than others. In particular, we have summarized a voluminous amount of material on regression in Chapter 7. The resulting presentation is rather succinct and difficult the first time through. We hope instructors will be able to compensate for the unevenness in level by judiciously choosing those sections, and subsections, appropriate for their students and by toning them down if necessary.

ORGANIZATION AND APPROACH

The methodological "tools" of multivariate analysis are contained in Chapters 5 through 12. These chapters represent the heart of the book, but they cannot be assimilated without much of the material in the introductory Chapters 1 through 4. Even those readers with a good knowledge of matrix algebra or those willing to accept the mathematical results on faith should, at the very least, peruse Chapter 3, "Sample Geometry," and Chapter 4, "Multivariate Normal Distribution." Our approach in the methodological chapters is to keep the discussion direct and uncluttered. Typically, we start with a formulation of the population models, delineate the corresponding sample results, and liberally illustrate everything with examples. The examples are of two types: those that are simple and whose calculations can be easily done by hand, and those that rely on real-world data and computer software. These will provide an opportunity to (1) duplicate our analyses, (2) carry out the analyses dictated by exercises, or (3) analyze the data using methods
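
As a small illustration of the kind of computer-based analysis the preface describes, the following hypothetical sketch (not code from the book) computes Hotelling's T² statistic for testing a hypothesized mean vector, a topic from Chapter 5. It assumes NumPy and SciPy are available; the data are synthetic and the function name hotelling_t2 is invented for this example.

    # Hypothetical sketch: Hotelling's T^2 test of H0: mu = mu0.
    import numpy as np
    from scipy import stats

    def hotelling_t2(X, mu0):
        n, p = X.shape
        x_bar = X.mean(axis=0)
        S = np.cov(X, rowvar=False)
        diff = x_bar - np.asarray(mu0)
        t2 = n * diff @ np.linalg.solve(S, diff)   # T^2 = n (xbar - mu0)' S^{-1} (xbar - mu0)
        f_stat = (n - p) / (p * (n - 1)) * t2      # scaled T^2 has an F(p, n - p) null distribution
        p_value = stats.f.sf(f_stat, p, n - p)
        return t2, p_value

    rng = np.random.default_rng(1)
    X = rng.normal(loc=[1.0, 2.0], scale=1.0, size=(30, 2))
    print(hotelling_t2(X, mu0=[1.0, 2.0]))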
