Sunday, April 25, 2021

Simulate two stage least squares in Python

Question

np.random.seed(123)
for r in [0,0.5,1]:
    x = []
    y = []
    current_x = 0
    current_y = 0
    u = np.random.randn()
    for i in range(50):
        x.append(current_x)
        w = np.random.randn()
        current_x = w + r * u
        y.append(current_y)
        y = add_constant(y)
        current_y = current_x + u

    res_ols = IV2SLS(np.array(y), np.array(x), None, None).fit(cov_type='unadjusted')
    print(res_ols)

    res_second = IV2SLS(np.array(y), exog, np.array(x), np.array(w)).fit(cov_type='unadjusted')
    print(res_second)

I am not sure how to simulate the exogenous regressor. Should I add a constant as the exogenous variable? Could someone find the error in my code? Any help is much appreciated!
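For reference, here is one way the simulation could be set up with linearmodels' IV2SLS. This is only a sketch under my own assumptions, not necessarily what the original code intended: the error u and the instrument w are drawn once per observation rather than once per loop, the data are generated in vectorized form, and a constant column is used as the sole exogenous regressor. Names such as n, const and res_2sls are introduced here purely for illustration.

# Illustrative sketch only; variable names n, const, res_2sls are not from the question.
import numpy as np
from linearmodels.iv import IV2SLS

np.random.seed(123)
n = 50

for r in [0, 0.5, 1]:
    u = np.random.randn(n)            # structural error, one draw per observation
    w = np.random.randn(n)            # instrument, independent of u
    x = w + r * u                     # regressor; endogenous whenever r > 0
    y = x + u                         # outcome, true slope equal to 1

    const = np.ones((n, 1))           # constant column used as the exogenous regressor

    # OLS benchmark: x enters as an exogenous regressor, no endogenous part, no instruments
    res_ols = IV2SLS(y, np.column_stack([const, x]), None, None).fit(cov_type='unadjusted')
    print(res_ols)

    # 2SLS: constant is exogenous, x is endogenous, w is the instrument
    res_2sls = IV2SLS(y, const, x.reshape(-1, 1), w.reshape(-1, 1)).fit(cov_type='unadjusted')
    print(res_2sls)

With this setup the OLS slope on x drifts away from 1 as r grows, while the 2SLS estimate stays close to 1, which is the usual point of such a simulation.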

https://stackoverflow.com/questions/67250157/simulate-two-stage-least-squares-in-python April 25, 2021 at 01:41PM
