A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries. / Hansen, Henrik; Klejnstrup, Ninja Ritter; Andersen, Ole Winckler.

In: American Journal of Evaluation, Vol. 34, No. 3, 09.2013, pp. 320-338.


Harvard

Hansen, H, Klejnstrup, NR & Andersen, OW 2013, 'A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries', American Journal of Evaluation, vol. 34, no. 3, pp. 320-338. https://doi.org/10.1177/1098214013476915

APA

Hansen, H., Klejnstrup, N. R., & Andersen, O. W. (2013). A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries. American Journal of Evaluation, 34(3), 320-338. https://doi.org/10.1177/1098214013476915

Vancouver

Hansen H, Klejnstrup NR, Andersen OW. A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries. American Journal of Evaluation. 2013 Sep;34(3):320-338. https://doi.org/10.1177/1098214013476915

Author

Hansen, Henrik ; Klejnstrup, Ninja Ritter ; Andersen, Ole Winckler. / A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries. In: American Journal of Evaluation. 2013 ; Vol. 34, No. 3. pp. 320-338.

Bibtex

@article{ac0c93c5714c42f98a3c3200fe14636d,
title = "A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries",
abstract = "There is a long-standing debate as to whether nonexperimental estimators of causal effects of social programs can overcome selection bias. Most existing reviews either are inconclusive or point to significant selection biases in nonexperimental studies. However, many of the reviews, the so-called between-studies, do not make direct comparisons of the estimates. We survey four impact studies, all using data from development interventions that directly compare experimental and nonexperimental impact estimates. Our review illustrates that when the program participation process is well understood, and correctly modeled, then the nonexperimental estimators can overcome the selection bias to the same degree as randomized controlled trials. Hence, we suggest that evaluators of development programs aim to be careful and precise in the formulation of the statistical model for the assignment into the program and also to use the assignment information for model-based systematic sampling.",
author = "Henrik Hansen and Klejnstrup, {Ninja Ritter} and Andersen, {Ole Winckler}",
year = "2013",
month = sep,
doi = "10.1177/1098214013476915",
language = "English",
volume = "34",
pages = "320--338",
journal = "American Journal of Evaluation",
issn = "1098-2140",
publisher = "SAGE Publications",
number = "3",
}

RIS

TY - JOUR

T1 - A Comparison of Model-Based and Design-Based Impact Evaluations of Interventions in Developing Countries

AU - Hansen, Henrik

AU - Klejnstrup, Ninja Ritter

AU - Andersen, Ole Winckler

PY - 2013/9

Y1 - 2013/9

N2 - There is a long-standing debate as to whether nonexperimental estimators of causal effects of social programs can overcome selection bias. Most existing reviews either are inconclusive or point to significant selection biases in nonexperimental studies. However, many of the reviews, the so-called between-studies, do not make direct comparisons of the estimates. We survey four impact studies, all using data from development interventions that directly compare experimental and nonexperimental impact estimates. Our review illustrates that when the program participation process is well understood, and correctly modeled, then the nonexperimental estimators can overcome the selection bias to the same degree as randomized controlled trials. Hence, we suggest that evaluators of development programs aim to be careful and precise in the formulation of the statistical model for the assignment into the program and also to use the assignment information for model-based systematic sampling.

AB - There is a long-standing debate as to whether nonexperimental estimators of causal effects of social programs can overcome selection bias. Most existing reviews either are inconclusive or point to significant selection biases in nonexperimental studies. However, many of the reviews, the so-called between-studies, do not make direct comparisons of the estimates. We survey four impact studies, all using data from development interventions that directly compare experimental and nonexperimental impact estimates. Our review illustrates that when the program participation process is well understood, and correctly modeled, then the nonexperimental estimators can overcome the selection bias to the same degree as randomized controlled trials. Hence, we suggest that evaluators of development programs aim to be careful and precise in the formulation of the statistical model for the assignment into the program and also to use the assignment information for model-based systematic sampling.

U2 - 10.1177/1098214013476915

DO - 10.1177/1098214013476915

M3 - Journal article

VL - 34

SP - 320

EP - 338

JO - American Journal of Evaluation

JF - American Journal of Evaluation

SN - 1098-2140

IS - 3

ER -

ID: 47902046