Using Mixed Methods and Collaboration to Evaluate an Education and Public Outreach Program (Invited)
Abstract
Traditional indicators (such as the number of participants or Likert-type ratings of participant perceptions) are often used to provide stakeholders with basic information about program outputs and to justify funding decisions. However, the use of qualitative methods can strengthen the reliability of these data and provide stakeholders with more meaningful information about program challenges, successes, and ultimate impacts (Stern, Stame, Mayne, Forss, Davies & Befani, 2012). In this session, presenters will discuss how they used a mixed methods evaluation to determine the impact of an education and public outreach (EPO) program. EPO efforts were intended to foster more effective, sustainable, and efficient utilization of science discoveries and learning experiences through three main goals: 1) increase engagement and support by leveraging resources, expertise, and best practices; 2) organize a portfolio of resources for accessibility, connectivity, and strategic growth; and 3) develop an infrastructure to support coordination. The evaluation team used a mixed methods design to conduct the evaluation. Presenters will first discuss five potential benefits of mixed methods designs: triangulation of findings, development, complementarity, initiation, and value diversity (Greene, Caracelli & Graham, 1989). They will next demonstrate how a 'mix' of methods, including artifact collection, surveys, interviews, focus groups, and vignettes, was included in the EPO project's evaluation design, providing specific examples of how alignment between the program theory and the evaluation plan was best achieved with a mixed methods approach. The presentation will also include an overview of different mixed methods approaches and information about important considerations when using a mixed methods design, such as the selection of data collection methods and sources and the timing and weighting of quantitative and qualitative methods (Creswell, 2003).
Ultimately, this presentation will provide insight into how a mixed methods approach was used to provide stakeholders with important information about progress toward program goals.

References:
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: Sage.
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274.
Stern, E., Stame, N., Mayne, J., Forss, K., Davies, R., & Befani, B. (2012). Broadening the range of designs and methods for impact evaluation. Department for International Development.
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2013
- Bibcode: 2013AGUFMED53F0676S
- Keywords: 0840 EDUCATION Evaluation and assessment