A data-augmented approach to evaluating teamwork using log data from digital collaboration tools
Shi, Wenxuan
Permalink
https://hdl.handle.net/2142/130030
Description
Issue Date
2025-07-10
Director of Research (if dissertation) or Advisor (if thesis)
Bailey, Brian
Doctoral Committee Chair(s)
Bailey, Brian
Committee Member(s)
Sundaram, Hari
Mercier, Emma
Sterman, Sarah
Wang, Dakuo
Department of Study
Siebel School Comp & Data Sci
Discipline
Computer Science
Degree Granting Institution
University of Illinois Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
Human-Computer Interaction
Collaboration
Teamwork
Team Assessment
Peer Evaluations
Log Data
Data-driven Assessment
Interface Design
Language
eng
Abstract
Teamwork is a critical skill in both educational and professional settings, yet evaluating individual contributions to collaborative work remains a persistent challenge. Instructors often rely on peer evaluations to assess teamwork, but these evaluations are prone to social and cognitive biases, including memory limitations, inconsistent standards, and interpersonal pressures. At the same time, the increasing use of digital collaboration tools such as Google Docs, GitHub, and Slack presents a new opportunity: these tools generate rich log data that can reveal how teams interact, coordinate, and contribute over time.

This dissertation presents the design and evaluation of a data-augmented peer evaluation system that extracts, analyzes, and visualizes log data from digital collaboration tools to support students' evaluations of teamwork. By reviewing the visualization prior to rating their teammates, students can ground their evaluations in concrete evidence of each team member's behaviors. My approach centers students as active participants in interpreting and annotating their own data, thus also contributing knowledge of how students understand and value contributions within teams.

This system is the culmination of a multi-stage investigation into how peer evaluations of teamwork can be augmented with log data from digital collaboration tools. I begin with a mixed-methods study examining current and potential uses of log data to support students' and instructors' evaluation practices. Through interviews and surveys, I find that while both groups already refer to log data to validate or challenge peer evaluations, this usage is often informal, retrospective, and shaped by uncertainty about how to interpret the data. Next, I conduct a between-subjects experiment to test whether reviewing log data during the peer evaluation process improves evaluation quality. The results show that log data increased the consistency and perceived accuracy of the evaluations, motivating the design of a system that incorporates log data into the peer evaluation process.

Grounded in the empirical findings of these two studies, I design and implement the data-augmented peer evaluation system through an iterative, user-centered design process. The system captures multiple components of teamwork: content, technical, and communication contributions in Google Drive, GitHub, and Slack. In the interface, students reflect on an interactive timeline visualization of the team's log data prior to evaluating their teammates, allowing them to ground their evaluations in the data. Students can also contextualize the data by marking contributions as valuable and tagging behaviors displayed by their teammates. I evaluate this system in a semester-long field study in a user interface design course. Results show that the data-augmented peer evaluation system helped students recall specific contributions, reflect on work patterns, and justify their ratings. At the same time, students expressed concerns about the representativeness of the data, with some even demonstrating resistance toward the system.

Together, these contributions demonstrate the value of integrating log data into peer evaluations and advance knowledge of how data can be used not just to monitor teams, but to support fair and meaningful assessments of teamwork that can guide learning. This work also contributes design principles for creating data-driven evaluation systems that balance fairness, student agency, and meaningful reflection. This dissertation progresses toward a future in which every person working in a team can compose and receive meaningful feedback that helps them feel valued for their contributions and learn effective teamwork behaviors.