

ECON0060

Problem Set 4

Question 1

Consider the panel AR(1) model

yit = yi,t−1β + αi + εit.

We assume that the shocks εit are independently distributed across both i and t with E(εit) = 0 and E(εit^2) = σ^2, and that αi is distributed independently from all εit.

(a) Is the moment condition E[(∆yit − ∆yi,t−1β)yi,t−2] = 0 satisfied?

(b) This moment condition corresponds to the following IV estimator after first differencing:

β̂ = [Σi Σt yi,t−2 ∆yi,t−1]^(−1) Σi Σt yi,t−2 ∆yit.

How many time periods T are needed, at a minimum, to implement this estimator? (A simulation sketch of this estimator is given after part (d).)

(c) Show that β̂ is a consistent estimator. Explain why yi,t−2 is a relevant instrument here.

(d) Could one replace the instrument yi,t−2 by the instrument yi,t+1, i.e. is the moment condition E[(∆yit − ∆yi,t−1β)yi,t+1] = 0 also satisfied?
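As an illustration of parts (b) and (c), the following is a minimal simulation sketch in Python (not part of the original problem set): it generates data from the panel AR(1) model above and computes the first-difference IV estimator β̂. The sample size, the value β = 0.5, the normal distributions, and the initial condition yi1 = αi + εi1 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, beta0 = 5000, 6, 0.5                 # illustrative sample size and true parameter

alpha = rng.normal(size=n)                 # individual effects alpha_i
eps = rng.normal(size=(n, T))              # shocks eps_it, iid across i and t
y = np.empty((n, T))                       # column t-1 holds y_it for periods t = 1, ..., T
y[:, 0] = alpha + eps[:, 0]                # simple initial condition (an assumption)
for t in range(1, T):
    y[:, t] = beta0 * y[:, t - 1] + alpha + eps[:, t]

dy = y[:, 1:] - y[:, :-1]                  # dy[:, j] holds Delta y_{i, j+2}

# IV estimator from part (b): instrument Delta y_{i,t-1} with y_{i,t-2}
num, den = 0.0, 0.0
for t in range(3, T + 1):                  # periods (1-based) for which all required lags exist
    z = y[:, t - 3]                        # y_{i,t-2}
    num += np.sum(z * dy[:, t - 2])        # y_{i,t-2} * Delta y_{i,t}
    den += np.sum(z * dy[:, t - 3])        # y_{i,t-2} * Delta y_{i,t-1}
beta_hat = num / den

print(f"true beta = {beta0}, IV estimate = {beta_hat:.3f}")
```

With n this large the estimate should sit close to the true β, which is the consistency claim of part (c); instrument relevance shows up numerically as a denominator that does not collapse towards zero.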

Question 2

Consider the static binary choice logit panel data model with T = 2. The conditional probability of observing the outcome yit ∈ {0, 1} is given by

f(yit | xit, αi, β) = [F(xitβ + αi)]^yit [1 − F(xitβ + αi)]^(1−yit),

where F(u) = (1 + exp(−u))^(−1) is the logistic cdf, xit is a 1 × K vector of strictly exogenous regressors, β is a K × 1 vector of unknown parameters, and αi ∈ R is an unknown individual-specific fixed effect. We assume conditional independence over time, implying that the conditional probability of observing yi = (yi1, yi2) is given by

f(yi|xi, αi, β) = f(yi1|xi1, αi, β) f(yi2|xi2, αi, β),

where xi = (xi1, xi2). We assume that we observe an iid sample of (xi, yi), i = 1, . . . , n. The goal is to estimate β.
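As a point of reference (not part of the original problem set), here is a minimal Python sketch of how data from this T = 2 model could be simulated. The regressor design, the correlation of αi with xi1, and the parameter values are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, K = 5000, 2
beta0 = np.array([1.0, -0.5])                    # illustrative K x 1 parameter

x = rng.normal(size=(n, 2, K))                   # x_it, t = 1, 2: strictly exogenous regressors
alpha = 0.5 * x[:, 0, 0] + rng.normal(size=n)    # fixed effect, allowed to depend on x_i
index = x @ beta0 + alpha[:, None]               # x_it beta + alpha_i, shape (n, 2)
p = 1.0 / (1.0 + np.exp(-index))                 # F(x_it beta + alpha_i), the logistic cdf
y = (rng.uniform(size=(n, 2)) < p).astype(int)   # y_i1, y_i2: conditionally independent Bernoulli draws
```

Note that yi1 and yi2 are drawn independently of each other only conditionally on (xi, αi), exactly as the conditional-independence assumption above requires.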

(a) Show that

f(yi = (0, 1) | xi, αi, β) / f(yi = (1, 0) | xi, αi, β) = exp[(xi2 − xi1)β].

The key observation here is that αi has dropped out of this ratio of probabilities.

(b) The probability of observing yi conditional on yi being either equal to (0, 1) or (1, 0) reads

f(yi | yi ∈ {(0, 1), (1, 0)}, xi, αi, β) = f(yi | xi, αi, β) / [f((0, 1) | xi, αi, β) + f((1, 0) | xi, αi, β)],

where yi ∈ {(0, 1), (1, 0)}. Show that this conditional probability is independent of αi as well.

(c) Define I = {i ∈ {1, 2, . . . , n} : yi ∈ {(0, 1),(1, 0)}}, that is I is the subset of observations i for which yi equals either (0, 1) or (1, 0). For i ∈ I we introduce the shorter notation

f*(yi |xi, β) = f(yi|yi ∈ {(0, 1),(1, 0)}, xi, αi, β).

The conditional maximum likelihood estimator for β is given by

β̂ = argmaxβ Σi∈I log f*(yi | xi, β).

The conditional likelihood only sums over observations i ∈ I, so the effective sample size is n* = |I|, the number of elements in the set I. Standard results from (conditional) maximum likelihood theory apply; in particular we have, as n* → ∞,

√n* (β̂ − β0) →d N(0, V∞),    V∞ = {E*[−∂^2 log f*(yi | xi, β0) / ∂β∂β']}^(−1),

where β0 denotes the true parameter and E* denotes the expectation conditional on i ∈ I (or equivalently conditional on yi ∈ {(0, 1), (1, 0)}), which for any function g(yi, xi) can be evaluated as

E*[g(yi, xi)] = E[ Σy∈{(0,1),(1,0)} f*(y | xi, β0) g(y, xi) ].

Here, the outer expectation is over the distribution of the regressors xi. We cannot evaluate this outer expectation further, because we do not have a model for xi. Define ∆xi = xi2 − xi1.

Show that V∞ = {E[q(xi, β0) (∆xi)'(∆xi)]}^(−1) for some scalar function q(xi, β0), and calculate the function q(xi, β0) explicitly.

(Hint: It is useful to realize that f*((0, 1)| xi, β) = F[∆xiβ] and f*((1, 0)| xi, β) = 1−F[∆xiβ], implying that the conditional probabilities f*(yi |xi, β) simply describe a standard logit model over the two possible outcomes yi = (0, 1) and yi = (1, 0). Thus, standard asymptotic results for the MLE of a binary choice model apply here, and you can use those.)
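To connect the hint to an implementation, here is a minimal Python sketch (not part of the original problem set) of the conditional MLE from part (c): on the subsample I it fits a standard logit, without intercept, of di = 1{yi = (0, 1)} on ∆xi, using the hint that f*((0, 1) | xi, β) = F(∆xiβ). The Newton iteration, the function name conditional_logit, and the simulated data in the usage example are illustrative assumptions, not a reference implementation.

```python
import numpy as np

def conditional_logit(y, x, tol=1e-10, max_iter=100):
    """Conditional (fixed-effects) logit for T = 2.

    y : (n, 2) array of binary outcomes; x : (n, 2, K) array of regressors.
    Returns (beta_hat, V_hat), where V_hat is the inverse negative Hessian,
    a sample analogue of V_infty / n*.
    """
    switchers = y.sum(axis=1) == 1                # i in I: y_i equals (0, 1) or (1, 0)
    d = y[switchers, 1]                           # d_i = 1 exactly when y_i = (0, 1)
    dx = x[switchers, 1, :] - x[switchers, 0, :]  # Delta x_i = x_i2 - x_i1

    beta = np.zeros(dx.shape[1])
    for _ in range(max_iter):                     # Newton-Raphson for the logit MLE
        p = 1.0 / (1.0 + np.exp(-dx @ beta))      # F(Delta x_i beta)
        score = dx.T @ (d - p)
        hess = -(dx * (p * (1.0 - p))[:, None]).T @ dx
        step = np.linalg.solve(hess, score)
        beta = beta - step
        if np.max(np.abs(step)) < tol:
            break
    return beta, np.linalg.inv(-hess)

# Usage on simulated data (same illustrative design as the earlier sketch):
rng = np.random.default_rng(0)
n, beta0 = 5000, np.array([1.0, -0.5])
x = rng.normal(size=(n, 2, 2))
alpha = 0.5 * x[:, 0, 0] + rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-(x @ beta0 + alpha[:, None])))
y = (rng.uniform(size=(n, 2)) < p).astype(int)
beta_hat, V_hat = conditional_logit(y, x)
print("beta_hat =", beta_hat, "  std. err. =", np.sqrt(np.diag(V_hat)))
```

Only the switchers contribute, so the effective sample size is n* = |I|, as noted in part (c); the reported standard errors come from inverting the estimated Hessian, the sample counterpart of V∞.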




