Introduction to Time Series and Forecasting

by Peter J. Brockwell; Richard A. Davis
  • ISBN13: 9780387953519
  • ISBN10: 0387953515
  • Edition: 2nd
  • Format: Hardcover
  • Copyright: 3/1/2002
  • Publisher: Springer Verlag

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks, and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks!
List Price: $119.00 (save up to $88.96)
  • Rent Book: $39.57 (free shipping)

    In stock; usually ships within 24 hours.
    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.


Summary

This book is aimed at readers who wish to gain a working knowledge of time series and forecasting methods as applied in economics, engineering, and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra, and elementary statistics. This second edition contains detailed instructions for the use of the new, fully Windows-based computer package ITSM2000, the student version of which is included with the text.

Expanded treatments are also given of several topics covered only briefly in the first edition. These include regression with time series errors, which plays an important role in forecasting and inference, and ARCH and GARCH models, which are widely used for modeling financial time series; both can be fitted using the new version of ITSM.

The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series, and state-space models, with an optional chapter on spectral analysis. Additional topics include the Burg and Hannan-Rissanen algorithms, unit roots, the EM algorithm, structural models, generalized state-space models with applications to time series of count data, exponential smoothing, the Holt-Winters and ARAR forecasting algorithms, transfer function models, and intervention analysis. Brief introductions are also given to cointegration and to nonlinear, continuous-time, and long-memory models.
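In the book itself, model fitting and forecasting are carried out with the bundled ITSM2000 package. Purely as an illustrative sketch of the kind of ARMA fitting and forecasting workflow the text covers, the snippet below uses Python's statsmodels library on simulated data; the library, the ARMA(1,1) order, and the data are assumptions for illustration and are not part of the book or its software.

    # Illustrative only: the book's software is ITSM2000, not Python.
    import numpy as np
    from statsmodels.tsa.arima_process import ArmaProcess
    from statsmodels.tsa.arima.model import ARIMA

    # Simulate 200 observations from an ARMA(1,1) process (made-up example data).
    ar = np.array([1, -0.7])   # AR polynomial 1 - 0.7z
    ma = np.array([1, 0.4])    # MA polynomial 1 + 0.4z
    y = ArmaProcess(ar, ma).generate_sample(nsample=200)

    # Fit an ARMA(1,1) model (an ARIMA model with d = 0) by maximum likelihood.
    fit = ARIMA(y, order=(1, 0, 1)).fit()
    print(fit.summary())

    # Forecast the next 10 values, as in the book's forecasting chapters.
    print(fit.forecast(steps=10))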

Table of Contents

Preface vii
Introduction
1(44)
Examples of Time Series
1(5)
Objectives of Time Series Analysis
6(1)
Some Simple Time Series Models
7(8)
Some Zero-Mean Models
8(1)
Models with Trend and Seasonality
9(5)
A General Approach to Time Series Modeling
14(1)
Stationary Models and the Autocorrelation Function
15(8)
The Sample Autocorrelation Function
18(3)
A Model for the Lake Huron Data
21(2)
Estimation and Elimination of Trend and Seasonal Components
23(12)
Estimation and Elimination of Trend in the Absence of Seasonality
24(7)
Estimation and Elimination of Both Trend and Seasonality
31(4)
Testing the Estimated Noise Sequence
35(10)
Problems
40(5)
Stationary Processes
45(38)
Basic Properties
45(6)
Linear Processes
51(4)
Introduction to ARMA Processes
55(2)
Properties of the Sample Mean and Autocorrelation Function
57(6)
Estimation of μ
58(1)
Estimation of γ(.) and ρ(.)
59(4)
Forecasting Stationary Time Series
63(14)
The Durbin-Levinson Algorithm
69(2)
The Innovations Algorithm
71(4)
Prediction of a Stationary Process in Terms of Infinitely Many Past Values
75(2)
The Wold Decomposition
77(6)
Problems
78(5)
ARMA Models
83(28)
ARMA(p, q) Processes
83(5)
The ACF and PACF of an ARMA(p, q) Process
88(12)
Calculation of the ACVF
88(6)
The Autocorrelation Function
94(1)
The Partial Autocorrelation Function
94(2)
Examples
96(4)
Forecasting ARMA Processes
100(11)
Problems
108(3)
Spectral Analysis
111(26)
Spectral Densities
112(9)
The Periodogram
121(6)
Time-Invariant Linear Filters
127(5)
The Spectral Density of an ARMA Process
132(5)
Problems
134(3)
Modeling and Forecasting with ARMA Processes
137(42)
Preliminary Estimation
138(20)
Yule-Walker Estimation
139(8)
Burg's Algorithm
147(3)
The Innovations Algorithm
150(6)
The Hannan-Rissanen Algorithm
156(2)
Maximum Likelihood Estimation
158(6)
Diagnostic Checking
164(3)
The Graph of {Rt, t = 1,...,n}
165(1)
The Sample ACF of the Residuals
166(1)
Tests for Randomness of the Residuals
166(1)
Forecasting
167(2)
Order Selection
169(10)
The FPE Criterion
170(1)
The AICC Criterion
171(3)
Problems
174(5)
Nonstationary and Seasonal Time Series Models
179(44)
ARIMA Models for Nonstationary Time Series
180(7)
Identification Techniques
187(6)
Unit Roots in Time Series Models
193(5)
Unit Roots in Autoregressions
194(2)
Unit Roots in Moving Averages
196(2)
Forecasting ARIMA Models
198(5)
The Forecast Function
200(3)
Seasonal ARIMA Models
203(7)
Forecasting SARIMA Processes
208(2)
Regression with ARMA Errors
210(13)
OLS and GLS Estimation
210(3)
ML Estimation
213(6)
Problems
219(4)
Multivariate Time Series
223(36)
Examples
224(5)
Second-Order Properties of Multivariate Time Series
229(5)
Estimation of the Mean and Covariance Function
234(7)
Estimation of μ
234(1)
Estimation of Γ(h)
235(2)
Testing for Independence of Two Stationary Time Series
237(1)
Bartlett's Formula
238(3)
Multivariate ARMA Processes
241(3)
The Covariance Matrix Function of a Causal ARMA Process
244(1)
Best Linear Predictors of Second-Order Random Vectors
244(2)
Modeling and Forecasting with Multivariate AR Processes
246(8)
Estimation for Autoregressive Processes Using Whittle's Algorithm
247(3)
Forecasting Multivariate Autoregressive Processes
250(4)
Cointegration
254(5)
Problems
256(3)
State-Space Models
259(58)
State-Space Representations
260(3)
The Basic Structural Model
263(4)
State-Space Representation of ARIMA Models
267(4)
The Kalman Recursions
271(6)
Estimation for State-Space Models
277(6)
State-Space Models with Missing Observations
283(6)
The EM Algorithm
289(3)
Generalized State-Space Models
292(25)
Parameter-Driven Models
292(7)
Observation-Driven Models
299(12)
Problems
311(6)
Forecasting Techniques
317(14)
The ARAR Algorithm
318(4)
Memory Shortening
318(1)
Fitting a Subset Autoregression
319(1)
Forecasting
320(1)
Application of the ARAR Algorithm
321(1)
The Holt-Winters Algorithm
322(4)
The Algorithm
322(2)
Holt-Winters and ARIMA Forecasting
324(2)
The Holt-Winters Seasonal Algorithm
326(2)
The Algorithm
326(2)
Holt-Winters Seasonal and ARIMA Forecasting
328(1)
Choosing a Forecasting Algorithm
328(3)
Problems
330(1)
Further Topics
331(38)
Transfer Function Models
331(9)
Prediction Based on a Transfer Function Model
337(3)
Intervention Analysis
340(3)
Nonlinear Models
343(14)
Deviations from Linearity
344(1)
Chaotic Deterministic Sequences
345(2)
Distinguishing Between White Noise and iid Sequences
347(1)
Three Useful Classes of Nonlinear Models
348(1)
Modeling Volatility
349(8)
Continuous-Time Models
357(4)
Long-Memory Models
361(8)
Problems
365(4)
A. Random Variables and Probability Distributions 369(14)
A.1. Distribution Functions and Expectation
369(5)
A.2. Random Vectors
374(3)
A.3. The Multivariate Normal Distribution
377(6)
Problems
381(2)
B. Statistical Complements 383(10)
B.1. Least Squares Estimation
383(3)
B.1.1. The Gauss-Markov Theorem
385(1)
B.1.2. Generalized Least Squares
386(1)
B.2. Maximum Likelihood Estimation
386(2)
B.2.1. Properties of Maximum Likelihood Estimators
387(1)
B.3. Confidence Intervals
388(1)
B.3.1. Large-Sample Confidence Regions
388(1)
B.4. Hypothesis Testing
389(4)
B.4.1. Error Probabilities
390(1)
B.4.2. Large-Sample Tests Based on Confidence Regions
390(3)
C. Mean Square Convergence 393(2)
C.1. The Cauchy Criterion
393(2)
D. An ITSM Tutorial 395(28)
D.1. Getting Started
396(1)
D.1.1. Running ITSM
396(1)
D.2. Preparing Your Data for Modeling
396(7)
D.2.1. Entering Data
397(1)
D.2.2. Information
397(1)
D.2.3. Filing Data
397(1)
D.2.4. Plotting Data
398(1)
D.2.5. Transforming Data
398(5)
D.3. Finding a Model for Your Data
403(8)
D.3.1. Autofit
403(1)
D.3.2. The Sample ACF and PACF
403(1)
D.3.3. Entering a Model
404(2)
D.3.4. Preliminary Estimation
406(2)
D.3.5. The AICC Statistic
408(1)
D.3.6. Changing Your Model
408(1)
D.3.7. Maximum Likelihood Estimation
409(1)
D.3.8. Optimization Results
410(1)
D.4. Testing Your Model
411(4)
D.4.1. Plotting the Residuals
412(1)
D.4.2. ACF/PACF of the Residuals
412(2)
D.4.3. Testing for Randomness of the Residuals
414(1)
D.5. Prediction
415(1)
D.5.1. Forecast Criteria
415(1)
D.5.2. Forecast Results
415(1)
D.6. Model Properties
416(5)
D.6.1. ARMA Models
417(1)
D.6.2. Model ACF, PACF
418(1)
D.6.3. Model Representations
419(1)
D.6.4. Generating Realizations of a Random Series
420(1)
D.6.5. Spectral Properties
421(1)
D.7. Multivariate Time Series
421(2)
References 423(6)
Index 429

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.
