Monte Carlo simulation of a linear regression model with a lagged dependent variable
I have a linear regression model with a lagged dependent variable: y_t = beta_0 + beta_1*y_{t-1} + u_t. The initial value is y_0 = 2, and I know the true coefficients beta_0 = 2 and beta_1 = 1. How can I perform a Monte Carlo simulation that estimates the bias of the OLS coefficients?
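One way to set this up is to simulate the process many times, re-estimate the model by OLS in each replication, and compare the average estimates with the true coefficients. Below is a minimal MATLAB sketch of that idea; the error distribution (u_t ~ N(0,1)), the sample size T = 100, and the number of replications R = 1000 are assumptions, not values given in the question.

% Minimal Monte Carlo sketch (assumed: u_t ~ N(0,1), T = 100, R = 1000)
rng(1);                          % for reproducibility
beta0 = 2;  beta1 = 1;           % true coefficients
y0 = 2;                          % initial value
T  = 100;                        % sample size per replication (assumed)
R  = 1000;                       % number of Monte Carlo replications (assumed)
bhat = zeros(R,2);               % storage for OLS estimates [beta0_hat, beta1_hat]

for r = 1:R
    u = randn(T,1);              % assumed standard normal errors
    y = zeros(T,1);
    ylag = y0;
    for t = 1:T                  % generate y_t = beta0 + beta1*y_{t-1} + u_t
        y(t) = beta0 + beta1*ylag + u(t);
        ylag = y(t);
    end
    Y = y(2:end);                % regress y_t on a constant and y_{t-1}
    X = [ones(T-1,1) y(1:end-1)];
    bhat(r,:) = (X\Y)';          % OLS estimates for this replication
end

bias = mean(bhat) - [beta0 beta1];   % Monte Carlo estimate of the OLS bias
disp(bias)

The mean of the R estimates minus the true values gives the simulated bias; increasing R reduces the Monte Carlo noise in that estimate.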
0 Comments
Answers (0)