Increasing the use of evaluation data collection in an EPO program
Abstract
Over the past two years, the Incorporated Research Institutions for Seismology Education and Public Outreach (EPO) program has sought to increase the evaluation rigor of its programs and products. Specifically, we sought to make evaluation an integral part of our work: enabling staff to demonstrate why we do the activities we do, enhancing the impact of our products and programs, and empowering staff to make evidence-based claims. The Collaborative Impact Analysis Method (Davis and Scalice, 2015) was selected because it allowed us to combine staff's knowledge of programs, audiences, and content with the expertise of an outside evaluator, through consultations and a qualitative rubric assessing the initial state of each product's or program's evaluation. Staff then developed action plans to improve the programs over time. A key part of the initial action plans has been the collection and analysis of new evaluation data. The most frequently used tools were surveys, as they were relatively straightforward to implement and analyze and could be adapted to different situations. Examples include surveys of brand awareness, the value of booth interactions, community interest in a data app, and users of social media and specific web pages. Other evaluation activities included beta testing of new software and interviews with students and faculty involved in summer field experiences. The surveys have allowed us to document increased impact in some areas, to improve the usability of products and activities, and to provide baseline impact data. The direct involvement of staff in the process has helped them appreciate the value of evaluation, but this approach also presents challenges. Because many of the surveys are developed and conducted by EPO staff, rather than handled primarily by the evaluator, the process takes considerably more staff time to implement.
We are still determining how best to manage and present the data and analysis; our current approach is to post evaluation reports on our EPO website so that other groups may benefit from our evaluation results.

Reference: Davis, H., & Scalice, D. (2015). Evaluate the impact of your education and outreach program using the quantitative Collaborative Impact Analysis Method. Abstract ED53D-0871, 2015 Fall Meeting, AGU.
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2017
- Bibcode: 2017AGUFMED43A..01T
- Keywords: 0805 Elementary and secondary education (EDUCATION); 0810 Post-secondary education (EDUCATION); 0815 Informal education (EDUCATION); 0840 Evaluation and assessment (EDUCATION)