Towards more effective testing and debugging education in computer science courses
Butler, Liia
Permalink
https://hdl.handle.net/2142/129210
Description
- Title
- Towards more effective testing and debugging education in computer science courses
- Author(s)
- Butler, Liia
- Issue Date
- 2025-04-18
- Director of Research (if dissertation) or Advisor (if thesis)
- Herman, Geoffrey L
- Doctoral Committee Chair(s)
- Herman, Geoffrey L
- Committee Member(s)
- Lewis, Colleen M
- Zilles, Craig
- Heckman, Sarah
- Department of Study
- Computer Science
- Discipline
- Computer Science
- Degree Granting Institution
- University of Illinois Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- debugging
- computing education
- programming education
- Abstract
- Testing and debugging are essential skills for ensuring code correctness, yet they are often not emphasized in computer science instruction. Further, while autograders can provide feedback to students quickly, students can become reliant on the autograder as their primary means of determining whether their code is correct, rather than developing their own testing and debugging skills. In this dissertation, I first present a policy intended to encourage students to submit only code they believe to be correct to the autograder. We observed a modest increase in lab success measures relative to our goals. I next present a debugging exercise inspired by the pedagogical technique of Interactive Lecture Demonstrations (ILDs). During an ILD, students predict a demonstration's result, experience the demonstration, and then reflect on their experience. I adapted this process for debugging (ILDBug): students examine example program outputs and predict the bug, experience debugging by tracing the code and attempting to fix the bug, and finally reflect on their debugging process. We engaged students in a series of three ILDBug exercises during a lab section of an introductory programming course for non-CS engineering majors. To evaluate whether the exercises were improving students' debugging skills, I introduced a cross-over study design with three populations: each group of students completed the same three exercises but in a different order. I then compared how students performed on each exercise when it was their first exercise versus their last. We graded students' predictions for accuracy using a binary score and measured how long students took to fix the bugs using click-stream logs. Students generally fixed bugs faster after completing two ILDBug exercises, and they improved at predicting bugs that were moderately difficult to identify. Given ILDBug's promise in our initial study, our final study evaluates its effectiveness further. I created two versions of a debugging lab using the ILDBug exercise design: one focused more on hard-coding bugs, and the other on iteration bugs (see the illustrative sketch below the record metadata). I then included two debugging questions on a subsequent quiz, one involving a hard-coding bug and one involving a loop bug. For both questions, I tested whether students who completed the lab matching the bug type performed better than students who completed the other lab. There were no significant differences in performance between students who completed one lab version versus the other. I discuss my findings and how they align with prior literature, as well as future research directions for the role ILDBug can play in explicit debugging instruction.
- Graduation Semester
- 2025-05
- Type of Resource
- Thesis
- Handle URL
- https://hdl.handle.net/2142/129210
- Copyright and License Information
- Copyright 2025 Liia Butler
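
A minimal, hypothetical illustration of the two bug categories named in the abstract, written in Python. These snippets are assumptions for illustration only and are not drawn from the dissertation's lab exercises: the hard-coding example returns a value tied to a single sample test case, and the iteration example has a loop that skips the first element.

```python
# Hypothetical examples of the two bug categories; not the dissertation's exercises.

def count_evens_hardcoded(numbers):
    """Intended to return how many values in numbers are even."""
    # Hard-coding bug: returns a literal that happens to match one sample
    # test (e.g. [2, 5, 8, 4, 7] -> 3) instead of computing from the input.
    return 3


def count_evens(numbers):
    """Correct version: the count is derived from the actual input."""
    count = 0
    for x in numbers:
        if x % 2 == 0:
            count += 1
    return count


def sum_list_buggy(numbers):
    """Intended to return the sum of all values in numbers."""
    total = 0
    # Iteration (loop) bug: range(1, len(numbers)) skips index 0,
    # so the first element is never added.
    for i in range(1, len(numbers)):
        total += numbers[i]
    return total


def sum_list(numbers):
    """Correct version: iterate over every value."""
    total = 0
    for value in numbers:
        total += value
    return total
```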
Owning Collections
Graduate Dissertations and Theses at Illinois (Primary)