Best Practices in the Evaluation of Large-scale STEM-focused Events: A Review of Recent Literature
Abstract
Each year, the National Aeronautics and Space Administration (NASA) sponsors a variety of educational events to share information with educators, students, and the general public. Intended outcomes of these events include increased interest in and awareness of the mission and goals of NASA. Events range in size from relatively small family science nights at a local school to large-scale mission and celestial event celebrations involving thousands of members of the general public. To support community members in designing event evaluations, the Science Mission Directorate (SMD) Planetary Science Forum sponsored the creation of a Best Practices Guide. The guide was generated by reviewing published large-scale event evaluation reports; however, the best practices described within are pertinent for all event organizers and evaluators regardless of event size. Each source included in the guide identified numerous challenges to conducting its event evaluation, including difficulty in identifying extant instruments or items, collecting representative data, and disaggregating data to inform different evaluation questions. Overall, the guide demonstrates that evaluations of large-scale events are generally conducted at a very basic level, with the types of data collected limited to observable demographic information and participant reactions gathered via online survey. In addition to these findings, this presentation will describe evaluation best practices that will help practitioners move beyond these basic indicators and examine how to make the evaluation process an integral—and valuable—element of event planning, ultimately informing event outcomes and impacts.
It will provide detailed information on five recommendations presented in the guide: 1) consider evaluation methodology, including data analysis, in advance; 2) design data collection instruments well in advance of the event; 3) collect data at different times and from multiple sources; 4) use technology to make the job easier; and 5) be aware of how challenging it is to measure impact.
- Publication: AGU Fall Meeting Abstracts
- Pub Date: December 2015
- Bibcode: 2015AGUFMED43A0865S
- Keywords: 0815 Informal education (EDUCATION); 0840 Evaluation and assessment (EDUCATION); 0845 Instructional tools (EDUCATION); 0855 Diversity (EDUCATION)