Chapter 3, Part II: Autoregressive Models

Another simple time series model is the first-order autoregression, denoted by AR(1). The series {x_t} is AR(1) if it satisfies the iterative equation (called a difference equation)

    x_t = α x_{t-1} + ε_t ,    (1)

where {ε_t} is a zero-mean white noise. We use the term autoregression since (1) is actually a linear regression model for x_t in terms of the explanatory variable x_{t-1}. That is, x_t is being modeled as a regression on its own past. We will see that ε_t is uncorrelated with past values of the AR series. Thus, ε_t represents the new contribution to x_t, and we can think of {ε_t} as a series of random shocks, or innovations.

The definition (1) is implicit, since x_t is defined in terms of its own past. It is useful to try to write the AR(1) process explicitly in terms of present and past innovations. Substituting for x_{t-1} in (1) gives

    x_t = α (α x_{t-2} + ε_{t-1}) + ε_t = ε_t + α ε_{t-1} + α^2 x_{t-2} .    (2)

Substituting for x_{t-2} in (2) gives

    x_t = ε_t + α ε_{t-1} + α^2 (α x_{t-3} + ε_{t-2}) = ε_t + α ε_{t-1} + α^2 ε_{t-2} + α^3 x_{t-3} .

Continuing this process, we see that for any k,

    x_t = ε_t + α ε_{t-1} + ... + α^{k-1} ε_{t-k+1} + α^k x_{t-k} .    (3)

The value of the parameter α strongly affects the behavior of the AR(1) process. Suppose that k in equation (3) is large, so that the underlying series {ε_t} started a long time before our observations x_1, ..., x_n were formed. Then if -1 < α < 1, the last term in (3) will be negligible, and the weight given to shocks which occurred a long time ago will also be extremely small. The resulting series {x_t} will be stationary. If, on the other hand, |α| > 1, the last term in (3) will be large in magnitude, and the weights given to distant shocks will be much greater than those given to more recent ones. The model is then said to be explosive, since the series mean and variance both explode as t grows. The explosive model is not considered useful for economic time series. Finally, if α = 1, we
obtain a useful model called the random walk, which is neither explosive nor stationary. It will be discussed later. Here, we assume that -1 < α < 1, and that k is very large, so that (3) is effectively

    x_t = ε_t + α ε_{t-1} + α^2 ε_{t-2} + ... .    (4)

Equation (4) is called an MA(∞) representation for the AR(1) process, since it expresses x_t as a moving average of an infinite number of present and past shocks. It follows from (4) that E[x_t ε_{t+1}] = 0. Thus the future shock ε_{t+1} is uncorrelated with the present data. More generally, we have

    E[x_t ε_{t+h}] = 0

for all t and all positive h. Thus, for example, the present shock ε_t is uncorrelated with all past series values. Also, all future shocks are uncorrelated with all present and past series values.

Another consequence of (4) is that

    var x_t = (1 + α^2 + α^4 + α^6 + ...) var ε_t = var ε_t / (1 - α^2) .

The covariance between x_t and x_{t+1} is

    cov(x_t, x_{t+1}) = E[(α x_t + ε_{t+1}) x_t] = α E[x_t^2] = α var x_t .

Thus, the correlation between x_t and x_{t+1} is

    corr(x_t, x_{t+1}) = cov(x_t, x_{t+1}) / var x_t = α .

So the degree of smoothness of {x_t} is determined by α: {x_t} is most smooth for α near 1, very unsmooth for α near -1.

A similar argument shows that

    corr(x_t, x_{t+h}) = α^h

for all h. Thus, there is always some correlation between present and future values, but this correlation dies down as we look further into the future. The implication is that future values are always forecastable, but forecasting becomes more difficult (inaccurate) as the lead time increases.

Forecasting for AR models is achieved by the same strategy used earlier for MA models: obtain an expression for the desired future value in terms of the {x_t} and the {ε_t}, and then replace all unknown terms by their optimal forecasts (which may be zero). Specifically, for one-step prediction in the AR(1) model, we have

    x_{n+1} = α x_n + ε_{n+1} .

The optimal forecast of ε_{n+1} is zero, since ε_{n+1} is uncorrelated with all present and past values of {x_t}. Thus, the optimal forecast of x_{n+1} is

    f_{n,1} = α x_n .

The one-step forecast error is

    e_{n,1} = x_{n+1} - f_{n,1} = ε_{n+1} ,

so the sequence of one-step errors is a white noise. For two-step prediction, use the relation

    x_{n+2} = α x_{n+1} + ε_{n+2} ,

replace ε_{n+2} by 0, and replace x_{n+1} by its optimal forecast f_{n,1} to obtain

    f_{n,2} = α f_{n,1} = α^2 x_n .

In general, the optimal h-step forecast is

    f_{n,h} = α^h x_n .

As the lead time h increases, the forecast approaches zero (i.e., the series mean).

A generalization of the AR(1) model is the p'th order autoregression, AR(p), generated by

    x_t = α_1 x_{t-1} + α_2 x_{t-2} + ... + α_p x_{t-p} + ε_t .

The solution to this difference equation depends on the starting values of {x_t} and on the {ε_t} series. The AR(p) series will be stationary if the largest root of the equation (in the complex variable z)

    z^p = α_1 z^{p-1} + α_2 z^{p-2} + ... + α_p

satisfies |z| < 1. In this case, the correlation function corr(x_t, x_{t+h}) (which is, because of stationarity, a function of h alone) will approximately lie in the region between -|z_max|^h and |z_max|^h, where z_max is the largest root, if h is not too small. This helps determine the shape of the plot of the correlation function against h. (More about this later.)

To forecast an AR(p) model with known parameters, use the usual strategy. For one-step forecasts, use

    x_{n+1} = α_1 x_n + α_2 x_{n-1} + ... + α_p x_{n-p+1} + ε_{n+1}

and replace ε_{n+1} by zero to obtain the forecast

    f_{n,1} = α_1 x_n + α_2 x_{n-1} + ... + α_p x_{n-p+1} .

For two-step forecasts, use

    x_{n+2} = α_1 x_{n+1} + α_2 x_n + ... + α_p x_{n-p+2} + ε_{n+2} ,

replace ε_{n+2} by zero and replace x_{n+1} by its optimal forecast f_{n,1} to obtain

    f_{n,2} = α_1 f_{n,1} + α_2 x_n + ... + α_p x_{n-p+2} .

h-step forecasts are obtained similarly.
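The properties derived above can be checked numerically. The following is a minimal simulation sketch, not part of the original notes; the choices α = 0.6, the sample size, and the function names are illustrative. It simulates a stationary AR(1) series directly from the difference equation (1), verifies that the sample correlation corr(x_t, x_{t+h}) is close to α^h, and implements the general h-step forecast recursion for AR(p), which in the AR(1) case reduces to f_{n,h} = α^h x_n.

```python
import random

def simulate_ar1(alpha, n, seed=0):
    """Simulate x_t = alpha * x_{t-1} + eps_t with N(0,1) white noise,
    discarding a burn-in so the series is effectively stationary
    (|alpha| < 1 assumed)."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for t in range(n + 200):          # 200 burn-in steps
        x = alpha * x + rng.gauss(0.0, 1.0)
        if t >= 200:
            series.append(x)
    return series

def sample_corr(series, h):
    """Crude sample correlation between x_t and x_{t+h}."""
    n = len(series)
    m = sum(series) / n
    var = sum((v - m) ** 2 for v in series) / n
    cov = sum((series[t] - m) * (series[t + h] - m) for t in range(n - h)) / n
    return cov / var

def ar_forecast(alphas, history, h):
    """h-step forecasts for an AR(p) model with coefficients
    alphas = [alpha_1, ..., alpha_p]: replace future shocks by zero
    and feed earlier forecasts back into the recursion."""
    vals = list(history)              # vals[-1] is the latest value x_n
    out = []
    for _ in range(h):
        nxt = sum(a * vals[-1 - i] for i, a in enumerate(alphas))
        vals.append(nxt)
        out.append(nxt)
    return out

alpha = 0.6
x = simulate_ar1(alpha, 5000)
print(sample_corr(x, 1))              # should be close to alpha
print(sample_corr(x, 3))              # should be close to alpha**3
print(ar_forecast([alpha], [2.0], 3)) # AR(1): [alpha*2, alpha^2*2, alpha^3*2]
```

Note the feedback structure in `ar_forecast`: each forecast is computed from a mix of observed values and earlier forecasts, exactly as in the two-step derivation above.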
[Figure: Yen per U.S. Dollar exchange rate, Jan 1985 to July 1992. Panels show the exchange rate series; the mean-adjusted data with its random walk prediction; the adjusted data with its AR(1) prediction, f = .5 x(t-1); and a scatterplot of this month's vs. last month's exchange rate.]
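The figure compares random walk and AR(1) predictions on the actual exchange-rate data, which is not reproduced here. A rough sense of the comparison can be had on simulated data. The sketch below is illustrative only: the series is simulated (not the yen/dollar data), with a mean-adjusted AR(1) series at α = 0.5 as in the figure's fitted model. It compares the mean squared one-step errors of the random walk predictor f = x_n (the α = 1 forecast) and the AR(1) predictor f = .5 x_n; on data that is truly AR(1), the AR(1) predictor should do better.

```python
import random

def simulate_ar1(alpha, n, seed=1):
    """Simulate x_t = alpha * x_{t-1} + eps_t with N(0,1) shocks."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for t in range(n + 200):          # burn-in toward stationarity
        x = alpha * x + rng.gauss(0.0, 1.0)
        if t >= 200:
            series.append(x)
    return series

x = simulate_ar1(0.5, 5000)

# One-step prediction errors for the two competing predictors:
#   random walk:  f_n = x_n        (appropriate if alpha were 1)
#   AR(1):        f_n = 0.5 * x_n  (the fitted model in the figure)
rw_errs = [(x[t + 1] - x[t]) ** 2 for t in range(len(x) - 1)]
ar_errs = [(x[t + 1] - 0.5 * x[t]) ** 2 for t in range(len(x) - 1)]

mse_rw = sum(rw_errs) / len(rw_errs)
mse_ar = sum(ar_errs) / len(ar_errs)
print(mse_rw, mse_ar)                 # AR(1) MSE should be the smaller one
```

Since the one-step AR(1) forecast error is just ε_{n+1}, the AR(1) mean squared error here should be near var ε = 1, while the random walk predictor pays an extra penalty of (1 - α)^2 var x.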
