Quick comment: Sims is the guy who popularized the "theory-minimizing"
approach to the modeling of time series (e.g. VAR estimation).  Kevin
Foster, a brilliant young professor from Yale with a radical political
heart who teaches at City College, liked that approach a lot.  I
wonder how that stuff is going for him.  The idea is that, with
sufficient data, the theory suggests itself.  You just need to
stipulate that everything is related to (1) its own history, (2)
everything else in sight, and (3) everything else's history.  Then use
robust estimation methods and voila.  "Robust" here means
"non-parametric," imposing little a priori structure on the data
crunched, relying on data-intensive computational algorithms.  One
rationale for this approach is that computation just gets cheaper and
cheaper.
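
The stipulation above (everything regressed on its own history and on
everything else's history) is just a reduced-form VAR.  A minimal
sketch with made-up data and coefficients, using equation-by-equation
OLS, which is all the structure a reduced-form VAR imposes:

```python
# Minimal VAR(1) sketch: each of k series regressed on the lag of
# every series, including its own.  Data and coefficients here are
# simulated purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
T, k = 200, 3                      # 200 periods, 3 series
A_true = np.array([[0.5, 0.1, 0.0],
                   [0.0, 0.4, 0.2],
                   [0.1, 0.0, 0.3]])
Y = np.zeros((T, k))
for t in range(1, T):              # y_t = A y_{t-1} + e_t
    Y[t] = Y[t - 1] @ A_true.T + rng.normal(scale=0.1, size=k)

X = Y[:-1]                         # lagged values (regressors)
Z = Y[1:]                          # current values (dependents)
# OLS, equation by equation; lstsq solves X B = Z, with B = A'.
A_hat = np.linalg.lstsq(X, Z, rcond=None)[0].T
```

With enough data the estimated lag matrix recovers the one that
generated the series, with no theory imposed beyond the lag structure.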

About Sargent there's so much that can be said.  He was close to Lucas
and those people.  Sargent's and Sims's works are tightly
interrelated, but it's hard to see Sims as subscribing to Sargent's
insistence on micro-foundations.  Sims seems to me a more pragmatic,
salt-water type, and micro-foundations probably strike him as trying
to base astrophysics on the principles of quantum mechanics.
Telegraphically:
Before the Stokey, Lucas, and Prescott book was published, Sargent's
macro books were *the* intro grad textbooks in macro.  He got curious
about the 1980s "complexity" project (along with Arrow) but then he
dropped out.  His book with Lars Ljungqvist (not to be confused with
Lars Hansen) on recursive macro will now get more recognition.  In it,
he emphasized the mathematical unity of VARs, dynamic programming
(Newton's variational calculus reworked as a recursive procedure that
is more convenient with discrete data), and the Kalman filter.  I
think this is neat.  Geometrically, the fitted values from regressing
y on X, i.e. y-hat = X b, are the projection X (X′X)^{-1} X′ y of the
observed y onto the column space of X.  So you can interpret the
Kalman filter as recursive projection.  The mathematical trick behind
Kalman is the state-space specification ("representation") of the
system.
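
That projection reading of OLS can be checked numerically; the data
here are simulated purely for illustration:

```python
# OLS two ways: via the coefficient estimate b, and via the
# projection ("hat") matrix P = X (X'X)^{-1} X' applied to y.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
y = X @ np.array([2.0, -1.0]) + rng.normal(size=50)

b = np.linalg.solve(X.T @ X, X.T @ y)   # OLS coefficients
P = X @ np.linalg.solve(X.T @ X, X.T)   # projection matrix

assert np.allclose(X @ b, P @ y)        # identical fitted values
assert np.allclose(P @ P, P)            # projections are idempotent
```

The idempotence check is the geometric point: projecting twice
changes nothing, because y-hat already lies in the column space of X.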

If you have intro training in econometrics, here is an easy way to
view the Kalman filter.  In contrast with the simplest model, y_t = b0
+ b1 x_1t + e_t, where b0 and b1 are fixed parameters to be estimated
and e_t is the error term, the Kalman filter allows one to specify the
model as y_t = b0_t + b1_t x_1t + e_t and squeeze information from the
data to update the values of the parameters over time.  I myself used the
filter (in a paper co-authored with my friends Jason Hecht and Mihail
Velikov, http://www.opf.slu.cz/kfi/icfb/proc2009/pdf/15_Huato.pdf) to
generate time series for the forex betas on main currencies for over a
thousand European firms.  The dataset on forex betas remains unused,
as the three of us moved on to other things.
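
The time-varying-coefficient setup described above can be sketched as
a textbook Kalman recursion.  The random-walk law of motion for the
coefficients and the noise variances below are illustrative
assumptions of this sketch, not the specification used in the paper;
in practice the variances would be estimated (e.g. by maximum
likelihood):

```python
# Kalman filter for y_t = b0_t + b1_t x_t + e_t, where the state
# b_t = (b0_t, b1_t) is assumed to follow a random walk.
import numpy as np

rng = np.random.default_rng(2)
T = 300
x = rng.normal(size=T)
# Simulated drifting coefficients, starting near (1, 2).
b_true = np.cumsum(rng.normal(scale=0.05, size=(T, 2)), axis=0) + [1.0, 2.0]
y = b_true[:, 0] + b_true[:, 1] * x + rng.normal(scale=0.2, size=T)

Q = 0.05**2 * np.eye(2)            # coefficient-innovation variance
R = 0.2**2                         # observation-noise variance
b = np.zeros(2)                    # state estimate
P = 10.0 * np.eye(2)               # diffuse-ish initial uncertainty
path = np.empty((T, 2))
for t in range(T):
    P = P + Q                      # predict (random-walk transition)
    H = np.array([1.0, x[t]])      # observation vector [1, x_t]
    S = H @ P @ H + R              # innovation variance
    K = P @ H / S                  # Kalman gain
    b = b + K * (y[t] - H @ b)     # update with the prediction error
    P = P - np.outer(K, H @ P)     # update uncertainty
    path[t] = b                    # filtered coefficient series
```

Each pass updates the coefficient estimates from the latest prediction
error, which is exactly the "squeeze information from the data over
time" idea: `path` is a time series of betas, one pair per period.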
_______________________________________________
pen-l mailing list
[email protected]
https://lists.csuchico.edu/mailman/listinfo/pen-l