Files in this item: Jimenez_Maria.pdf (application/pdf, 2MB)

Title: An examination of the practice of culturally responsive evaluation: a case study of the Girls Adventures in Mathematics, Engineering, and Science (G.A.M.E.S.) summer camp at the University of Illinois
Author(s): Jimenez, Maria
Director of Research: DeStefano, Lizanne
Doctoral Committee Chair(s): Anderson, James D.
Doctoral Committee Member(s): DeStefano, Lizanne; Hood, Stafford; Trent, William T.
Department / Program: Education Policy, Organization & Leadership
Discipline: Educational Policy Studies
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): educational evaluation and policy; Science, Technology, Engineering, and Mathematics (STEM) evaluation
Abstract: In recent decades, evaluation approaches that seek to respond to the cultural context of the program being evaluated have proliferated. Of these, culturally responsive evaluation is the best known. Its purpose is to take the cultural context of the program into account by incorporating the culture of the program and its participants throughout the evaluation process: preparation, design, and implementation. This instrumental case study examines how a culturally responsive evaluation approach was applied to evaluate a Science, Technology, Engineering, and Mathematics (STEM) program. More specifically, it highlights the specific culturally responsive evaluation strategies employed throughout the evaluation process, as well as the strengths and challenges associated with the approach. The results of the present study support the future use of the culturally responsive evaluation approach. While challenges common to culturally responsive evaluation were evident in the study, they were certainly outweighed by the strengths. First, stakeholder engagement increased the overall credibility of the evaluation; for example, the input and advice of program staff improved the quality of the evaluation design and generated results that were useful and meaningful to program staff and participants. Second, a culturally diverse evaluation team increased the validity of the evaluation through shared lived experiences, which allowed for the accurate collection, analysis, and interpretation of evaluation findings. Third, a robust mixed-methods evaluation design that was inclusive of culturally responsive evaluation instruments, analysis, and interpretation also increased the validity of the overall evaluation and its results.
Further, analyzing the data through a culturally responsive evaluation lens provided an accurate depiction of the participants' experiences and, in some cases, revealed cultural differences that were important to the evaluation's purpose. The present study is significant in that it adds to the current body of literature on culturally responsive evaluation. In addition, it calls for additional strategies and guidelines for conducting culturally responsive evaluation in order to expand its practice both domestically and internationally.
Issue Date: 2012-09-18
Rights Information: Copyright 2012 Maria Jimenez
Date Available in IDEALS: 2012-09-18
Date Deposited: 2012-08
