Hello,
I am trying to use xts as much as possible in my time series work, since it seems to be the suggested way of doing things. However, I am getting a strange error.
CPI.NSA and INT are xts objects.
library(xts)
library(dynlm)
# subset both series to the date range of interest (dr1)
CPI.NSA.x <- CPI.NSA[dr1]
INT.x <- INT[dr1]
# convert the xts objects to zoo for comparison
CPI.NSA.z <- as.zoo(CPI.NSA.x)
INT.z <- as.zoo(INT.x)
> dynlm(CPI.NSA.z ~ INT.z + L(CPI.NSA.z, 1))
Time series regression with "zoo" data:
Start = 1953-02-01, End = 1971-06-01
Call:
dynlm(formula = CPI.NSA.z ~ INT.z + L(CPI.NSA.z, 1))
Coefficients:
    (Intercept)            INT.z  L(CPI.NSA.z, 1)
     -0.0006795        1.0440174       -0.0869050
> dynlm(CPI.NSA.x ~ INT.x + L(CPI.NSA.x, 1))
Error in `[.xts`(a, match0(indexes, attr(a, "index")), , drop = FALSE) :
i is out of range
It was my understanding that whenever a function accepts a zoo object, I can pass it an xts object instead and it should just work, but that is clearly not the case here.
What's going on?
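In case a self-contained example helps, the sketch below is roughly what I am doing, but with simulated data (y, x, and idx are placeholders, not my actual series); I believe it shows the same pattern, with the as.zoo() call running and the plain xts call failing.

library(xts)
library(dynlm)
set.seed(1)
idx <- seq(as.Date("1953-01-01"), by = "month", length.out = 120)
y <- xts(rnorm(120), order.by = idx)   # stand-in for CPI.NSA
x <- xts(rnorm(120), order.by = idx)   # stand-in for INT
dynlm(as.zoo(y) ~ as.zoo(x) + L(as.zoo(y), 1))   # zoo version runs for me
dynlm(y ~ x + L(y, 1))                           # xts version is the one that errors for me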
Thanks for the help.