Innovations in REU Program Evaluation: Maximizing Program Responsiveness

Moving beyond the end-of-program survey, this post offers a repeat-design program evaluation approach for an REU.

Suzanne M. Coshow

11/13/2024

An REU student's hands holding a Petri dish with a polymer.

Twenty years ago, a short, intensive, project-based research experience like the NSF Research Experiences for Undergraduates (REU) program would be evaluated with a post-program survey of participants' experiences. Indeed, I did exactly that. One downside of that approach, however, is the missed opportunity to respond to student needs during the program itself. This matters especially in a relatively short summer program, as opposed to a multi-year doctoral program, where a longer annual survey is often still the best choice. For a new NSF REU in 2022, I designed an innovative evaluation approach built on repeat-design feedback collected through short Google Forms.

Overview: This REU program is organized around a nine-week schedule and includes: 1) specific engineering concepts; 2) professional development; 3) independent research projects; and 4) communicating research.

Pre-/Post-Program Assessment: Pre-program baseline data are collected from all participants to capture their prior experience and knowledge of the specific engineering concepts, professional development topics, and substantive research topics and methods. Participants are also asked about their experience and training in communicating research. This baseline allows us to assess gains in participants' understanding of engineering concepts and chemical engineering research, their professional development, and their confidence in communicating research to others. We use statistical tests of significance to assess these gains, as sketched below.
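As a rough illustration of how such pre/post gains might be tested, here is a minimal sketch assuming Likert-scale self-ratings exported from Google Forms to CSV. The file names and column names are hypothetical, not the program's actual instruments.

```python
# Minimal sketch of a pre/post significance test, assuming Likert-scale
# self-ratings (e.g., 1-5) exported from Google Forms to CSV. The file
# names and column names below are hypothetical.
import pandas as pd
from scipy import stats

pre = pd.read_csv("pre_program.csv")    # hypothetical pre-program export
post = pd.read_csv("post_program.csv")  # one row per participant

# Align participants so each pre rating is paired with the same
# student's post rating.
merged = pre.merge(post, on="student_id", suffixes=("_pre", "_post"))

for topic in ["engineering_concepts", "research_confidence"]:
    before = merged[f"{topic}_pre"]
    after = merged[f"{topic}_post"]
    # Paired t-test on the gains; with small REU cohorts a
    # non-parametric alternative (Wilcoxon signed-rank) is also worth checking.
    t_stat, t_p = stats.ttest_rel(after, before)
    w_stat, w_p = stats.wilcoxon(after, before)
    print(f"{topic}: mean gain={(after - before).mean():.2f}, "
          f"paired t p={t_p:.3f}, Wilcoxon p={w_p:.3f}")
```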

Program Delivery Feedback Processes: Each week, students complete an "Activities Log" so that the program can track and monitor participants' experiences. One purpose is to make the program responsive, giving staff the chance to catch problems or issues quickly. The logs are reviewed each week by the evaluator, program faculty and staff, and/or the PIs. In one recent case, the program was able to respond the very day the feedback was received, resolving an issue with IT access to campus resources.

Facilitating quick responsiveness is the primary purpose of the weekly Activities Logs, but they also provide more detailed data about the students’ experiences with each engineering, professional development, and communicating research topic.
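A same-week review like this can be partly automated. Below is a rough sketch of how weekly Activities Log responses might be screened for follow-up; the CSV export and column names are hypothetical, not the program's actual form fields.

```python
# Rough sketch of a weekly Activities Log review, assuming the Google
# Form responses are exported to CSV. Column names here are hypothetical.
# Low ratings or flagged comments are surfaced for same-week follow-up.
import pandas as pd

logs = pd.read_csv("activities_log_week.csv")  # hypothetical weekly export

# Flag any participant whose weekly satisfaction rating (1-5) falls
# below a threshold, or who left a comment in the "issues" field.
needs_followup = logs[(logs["satisfaction"] <= 2) | logs["issues"].notna()]

for _, row in needs_followup.iterrows():
    print(f"Follow up with {row['student_id']}: "
          f"rating={row['satisfaction']}, issue={row['issues']}")
```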