
LONG-MEMORY TIME SERIES
Theory and Methods

Wilfredo Palma
Pontificia Universidad Católica de Chile

WILEY
2007


THE WILEY BICENTENNIAL-KNOWLEDGE FOR GENERATIONS

Each generation has its unique needs and aspirations. When Charles Wiley first opened his small printing shop in lower Manhattan in 1807, it was a generation of boundless potential searching for an identity. And we were there, helping to define a new American literary tradition. Over half a century later, in the midst of the Second Industrial Revolution, it was a generation focused on building the future. Once again, we were there, supplying the critical scientific, technical, and engineering knowledge that helped frame the world. Throughout the 20th

Century, and into the new millennium, nations began to reach out beyond their own borders and a new international community was born. Wiley was there, expanding its operations around the world to enable a global exchange of ideas, opinions, and know-how.

For 200 years, Wiley has been an integral part of each generation's journey, enabling the flow of information and understanding necessary to meet their needs and fulfill their aspirations. Today, bold new technologies are changing the way we live and learn. Wiley will be there, providing you the must-have knowledge you need to imagine new worlds, new possibilities, and new opportunities.

Generations come and go, but you can always count on Wiley to provide you the knowledge you need, when and where you need it!


Copyright © 2007 by John Wiley & Sons, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic format. For information about Wiley products, visit our web site at www.wiley.com.

Wiley Bicentennial Logo: Richard J. Pacifico

Library of Congress Cataloging-in-Publication Data is available.

ISBN 978-0-470-11402-5

Printed in the United States of America.

10 9 8 7 6 5 4 3 2 1

CONTENTS

Preface
Acronyms

1 Stationary Processes
1.1 Fundamental Concepts
1.1.1 Stationarity
1.1.2 Singularity and Regularity
1.1.3 Wold Decomposition Theorem
1.1.4 Causality
1.1.5 Invertibility
1.1.6 Best Linear Predictor
1.1.7 Szegö-Kolmogorov Formula
1.1.8 Ergodicity
1.1.9 Martingales
1.1.10 Cumulants
1.1.11 Fractional Brownian Motion
1.1.12 Wavelets
1.2 Bibliographic Notes
Problems

2 State Space Systems
2.1 Introduction
2.1.1 Stability
2.1.2 Hankel Operator
2.1.3 Observability
2.1.4 Controllability
2.1.5 Minimality
2.2 Representations of Linear Processes
2.2.1 State Space Form to Wold Decomposition
2.2.2 Wold Decomposition to State Space Form
2.2.3 Hankel Operator to State Space Form
2.3 Estimation of the State
2.3.1 State Predictor
2.3.2 State Filter
2.3.3 State Smoother
2.3.4 Missing Observations
2.3.5 Steady State System
2.3.6 Prediction of Future Observations
2.4 Extensions
2.5 Bibliographic Notes
Problems

3 Long-Memory Processes
3.1 Defining Long Memory
3.1.1 Alternative Definitions
3.1.2 Extensions
3.2 ARFIMA Processes
3.2.1 Stationarity, Causality, and Invertibility
3.2.2 Infinite AR and MA Expansions
3.2.3 Spectral Density
3.2.4 Autocovariance Function
3.2.5 Sample Mean
3.2.6 Partial Autocorrelations
3.2.7 Illustrations
3.2.8 Approximation of Long-Memory Processes
3.3 Fractional Gaussian Noise
3.3.1 Sample Mean
3.4 Technical Lemmas
3.5 Bibliographic Notes
Problems

4 Estimation Methods
4.1 Maximum-Likelihood Estimation
4.1.1 Cholesky Decomposition Method
4.1.2 Durbin-Levinson Algorithm
4.1.3 Computation of Autocovariances
4.1.4 State Space Approach
4.2 Autoregressive Approximations
4.2.1 Haslett-Raftery Method
4.2.2 Beran Approach
4.2.3 A State Space Method
4.3 Moving-Average Approximations
4.4 Whittle Estimation
4.4.1 Other Versions
4.4.2 Non-Gaussian Data
4.4.3 Semiparametric Methods
4.5 Other Methods
4.5.1 A Regression Method
4.5.2 Rescaled Range Method
4.5.3 Variance Plots
4.5.4 Detrended Fluctuation Analysis
4.5.5 A Wavelet-Based Method
4.6 Numerical Experiments
4.7 Bibliographic Notes
Problems

5 Asymptotic Theory
5.1 Notation and Definitions
5.2 Theorems
5.2.1 Consistency
5.2.2 Central Limit Theorem
5.2.3 Efficiency
5.3 Examples
5.4 Illustration
5.5 Technical Lemmas
5.6 Bibliographic Notes
Problems

6 Heteroskedastic Models
6.1 Introduction
6.2 ARFIMA-GARCH Model
6.2.1 Estimation
6.3 Other Models
6.3.1 Estimation
6.4 Stochastic Volatility
6.4.1 Estimation
6.5 Numerical Experiments
6.6 Application
6.6.1 Model without Leverage
6.6.2 Model with Leverage
6.6.3 Model Comparison
6.7 Bibliographic Notes
Problems

7 Transformations
7.1 Transformations of Gaussian Processes
7.2 Autocorrelation of Squares
7.3 Asymptotic Behavior
7.4 Illustrations
7.5 Bibliographic Notes
Problems

8 Bayesian Methods
8.1 Bayesian Modeling
8.2 Markov Chain Monte Carlo Methods
8.2.1 Metropolis-Hastings Algorithm
8.2.2 Gibbs Sampler
8.2.3 Overdispersed Distributions
8.3 Monitoring Convergence
8.4 A Simulated Example
8.5 Data Application
8.6 Bibliographic Notes
Problems

9 Prediction
9.1 One-Step Ahead Predictors
9.1.1 Infinite Past
9.1.2 Finite Past
9.1.3 An Approximate Predictor
9.2 Multistep Ahead Predictors
9.2.1 Infinite Past
9.2.2 Finite Past
9.3 Heteroskedastic Models
9.3.1 Prediction of Volatility
9.4 Illustration
9.5 Rational Approximations
9.5.1 Illustration
9.6 Bibliographic Notes
Problems

10 Regression
10.1 Linear Regression Model
10.1.1 Grenander Conditions
10.2 Properties of the LSE
10.2.1 Consistency
10.2.2 Asymptotic Variance
10.2.3 Asymptotic Normality
10.3 Properties of the BLUE
10.3.1 Efficiency of the LSE Relative to the BLUE
10.4 Estimation of the Mean
10.4.1 Consistency
10.4.2 Asymptotic Variance
10.4.3 Normality
10.4.4 Relative Efficiency
10.5 Polynomial Trend
10.5.1 Consistency
10.5.2 Asymptotic Variance
10.5.3 Normality
10.5.4 Relative Efficiency
10.6 Harmonic Regression
10.6.1 Consistency
10.6.2 Asymptotic Variance
10.6.3 Normality
10.6.4 Efficiency
10.7 Illustration: Air Pollution Data
10.8 Bibliographic Notes
Problems

11 Missing Data
11.1 Motivation
11.2 Likelihood Function with Incomplete Data
11.2.1 Integration
11.2.2 Maximization
11.2.3 Calculation of the Likelihood Function
11.2.4 Kalman Filter with Missing Observations
11.3 Effects of Missing Values on ML Estimates
11.3.1 Monte Carlo Experiments
11.4 Effects of Missing Values on Prediction
11.5 Illustrations
11.6 Interpolation of Missing Data
11.6.1 Bayesian Imputation
11.6.2 A Simulated Example
11.7 Bibliographic Notes
Problems

12 Seasonality
12.1 A Long-Memory Seasonal Model
12.2 Calculation of the Asymptotic Variance
12.3 Autocovariance Function
12.4 Monte Carlo Studies
12.5 Illustration
12.6 Bibliographic Notes
Problems

References
Topic Index
Author Index

PREFACE

During the last few decades, long-memory processes have evolved into a vital and important part of time series analysis. Long-range-dependent processes are characterized by slowly decaying autocorrelations or by a spectral density exhibiting a pole at the origin. These features dramatically change the statistical behavior of estimates and predictions. As a consequence, many of the theoretical results and methodologies used for analyzing short-memory time series, for instance, ARMA processes, are no longer appropriate for long-memory models.
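To make the first of these features concrete, one standard formalization (stated here only for orientation; the precise definitions adopted in this book are given in Chapter 3) says that a stationary process with autocovariance function \gamma(k) and spectral density f(\lambda) exhibits long memory with memory parameter 0 < d < 1/2 when

    \gamma(k) \sim c_\gamma \, k^{2d-1} \quad (k \to \infty), \qquad f(\lambda) \sim c_f \, \lambda^{-2d} \quad (\lambda \to 0^{+}),

for positive constants c_\gamma and c_f. At this decay rate the autocovariances are not absolutely summable, \sum_k |\gamma(k)| = \infty, whereas those of an ARMA process decay geometrically.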

This book aims to provide an overview of the theory and methods developed to deal with long-range-dependent data, as well as to describe some applications of these methodologies to real-life time series. It is intended to serve as a text for a graduate course and to be helpful to researchers and practitioners. However, it does not attempt to cover all of the relevant topics in this field.

Some basic knowledge of calculus and linear algebra, including derivatives, integrals, and matrices, is required for understanding most results in this book. Apart from this, the text is intended to be self-contained in terms of other, more advanced concepts. In fact, Chapter 1 of this book offers a brief discussion of fundamental mathematical and probabilistic concepts such as Hilbert spaces, orthogonal projections, stationarity, and ergodicity, among others. Definitions and basic properties are presented, and further readings are suggested in a bibliographic notes section. This chapter ends with a number of proposed exercises. Furthermore, Chapter 2 describes some fundamental concepts on state space systems and Kalman filter equations. As discussed in this chapter, state space systems offer an alternative representation of time series models which may be very useful for calculating estimates and predictors, especially in the presence of data gaps. In particular, we discuss applications of state space techniques to parameter estimation in Chapter 4, to missing values in Chapter 11, and to seasonal models in Chapter 12.
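To give a flavor of those recursions, the sketch below is a minimal illustration written for this preface rather than an excerpt from the book: the function name, the argument names, and the restriction to a scalar observation equation are assumptions of mine. It shows the one-step prediction and measurement update of the Kalman filter for a linear Gaussian state space model, and why data gaps are easy to accommodate: at a missing observation the measurement update is simply skipped and the state is carried forward by the time update alone.

    import numpy as np

    def kalman_filter(y, F, G, Q, R, x0, P0):
        # State space model (scalar observations):
        #   x[t+1] = F x[t] + w[t],  w[t] ~ N(0, Q)   (state equation)
        #   y[t]   = G x[t] + v[t],  v[t] ~ N(0, R)   (observation equation)
        # x0: (m, 1) initial state mean; P0: (m, m) initial covariance;
        # F: (m, m); G: (1, m); Q: (m, m); R: scalar.
        # Missing observations are encoded as np.nan.
        x, P = x0, P0
        preds, variances = [], []
        for yt in y:
            # One-step prediction of the observation and its variance
            y_hat = (G @ x).item()
            S = (G @ P @ G.T).item() + R
            preds.append(y_hat)
            variances.append(S)
            if not np.isnan(yt):
                # Measurement update; skipped when y[t] is missing
                K = P @ G.T / S              # Kalman gain, (m, 1)
                x = x + K * (yt - y_hat)
                P = P - K @ G @ P
            # Time update: propagate the state to t + 1
            x = F @ x
            P = F @ P @ F.T + Q
        return np.array(preds), np.array(variances)

For instance, a causal AR(1) process y_t = \phi y_{t-1} + \varepsilon_t fits this form with F = [[\phi]], G = [[1]], Q = [[\sigma^2]], and R = 0.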

Even though there seems to be general agreement that, in order to have long memory, a time series must exhibit slowly decaying autocorrelations, the formal definition of a long-range-dependent process is not necessarily unique. This issue is discussed in Chapter 3, where several mathematical definitions of long memory are reviewed. Chapter 4 is devoted to the analysis of a number of widely used parameter estimation methods for strongly dependent time series models. The methodologies are succinctly presented and some asymptotic results are discussed. Since a critical problem with many of these estimation methods is their computational implementation, some specific aspects such as algorithm efficiency and numerical complexity are also analyzed. A number of simulations and practical applications complete this chapter. The statistical analysis of the large-sample properties of the parameter estimates of long-memory time series models described in Chapter 4 is different and usually more complex than for short-memory processes. To illustrate this difference, Chapter 5 addresses some of the technical aspects of the proofs of consistency, the central limit theorem, and efficiency of maximum-likelihood estimators in the context of long-range-dependent processes.

Chapters 6 and 7 deal with heteroskedastic time series. These processes, frequently employed to model economic and financial data, assume that the conditional variance of an observation given its past may vary with time. While Chapter 6 describes several widely used heteroskedastic models, Chapter 7 characterizes these processes in terms of their memory and the memory of some of their nonlinear transformations. On the other hand, Chapter 8 discusses Bayesian methods for dealing with strongly dependent data. Special attention is dedicated to iterative procedures such as the Metropolis-Hastings algorithm and the Gibbs sampler. Prediction of long-memory time series models is reviewed in Chapter 9. This chapter summarizes several results on the prediction of stationary linear processes and discusses some specific methods for heteroskedastic time series. Linear regression models with strongly dependent disturbances are addressed in Chapter 10. In particular, some large-sample statistical properties of least squares and best linear unbiased estimators are analyzed, including consistency, asymptotic normality, and efficiency. Furthermore, these results are applied to the estimation of polynomial and harmonic regressions.
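As a concrete instance of the heteroskedastic structure just described, consider the GARCH(1,1) specification, one widely used model of this kind (the model classes treated in Chapter 6 may differ in detail):

    \varepsilon_t = \sigma_t z_t, \qquad \sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2,

where z_t is an i.i.d. sequence with zero mean and unit variance, \omega > 0, and \alpha, \beta \geq 0. The conditional variance of \varepsilon_t given its past is \sigma_t^2, which evolves with time, exactly the behavior these models are designed to capture.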

Most of the methods reviewed up to Chapter 10 are only applicable to complete time series. However, in many practical applications there are missing observations. This problem is analyzed in Chapter 11, which describes some state space techniques for dealing with data gaps. Finally, Chapter 12 examines some methodologies for the treatment of long-memory processes which display, in addition, cyclical or seasonal behavior. Apart from discussing some theoretical and methodological aspects of maximum-likelihood and quasi-maximum-likelihood estimation, this chapter illustrates the finite-sample performance of these techniques by Monte Carlo simulations and a real-life data application. It is worth noting that, as with Chapter 1, every chapter of this book ends with a bibliographic notes section and a list of proposed problems.

I wish to express my deep gratitude to Jay Kadane for many insightful discussions on time series statistical modeling, for encouraging me to write this book, and for valuable comments on a previous version of the manuscript. I am also indebted to many coauthors and colleagues; some of the results described in this text reflect part of that fruitful collaboration. I would like to thank Steve Quigley, Jacqueline Palmieri, Christine Punzo, and the editorial staff at John Wiley & Sons for their continuous support and for making the publication of this book possible. Special thanks to Anthony Brockwell for several constructive suggestions on a preliminary version of this work. I am also grateful for the support of the Department of Statistics and the Faculty of Mathematics at the Pontificia Universidad Católica de Chile. Several chapters of this book evolved from lecture notes for graduate courses on time series analysis. I would like to thank many students for useful remarks on the text and for trying out the proposed exercises. Financial support from Fondecyt Grant 1040934 is gratefully acknowledged.
