Using student feedback in course improvement

Four Purdue instructors recently shared how they work with students to co-construct learning environments before and during COVID-19 remote teaching. Speaking via WebEx during the most recent meeting of the Purdue Active Learning Community of Practice (ALCoP), the instructors covered topics ranging from quick checks for understanding and feedback on instructional strategies to finding answers collaboratively and fostering relatedness and engagement.

How’s it going?

Michael Mashtare, assistant professor in Agronomy and Environmental and Ecological Engineering, uses a variety of methods to gather feedback on students’ content understanding and learning experience in his Forest Soils and Soil Science classes. More than 80 percent of his approximately 200 students typically answer his questions on paper and in electronic form, and the majority agree that these strategies give them a voice in the class – and appreciate it. While he encourages the use of mid-semester feedback – typically using the same questions asked in end-of-semester evaluations – Mashtare also “takes the pulse” of students throughout the semester.

  • iClicker questions are used to check for content understanding, but also to ask how students are managing the course. For example, a multiple-choice question about how much time they spent completing a lab offers insight into students’ approach to the work and provides a place to begin a dialogue.
Example slide from Mashtare's class: students use iClickers to answer questions such as "How much time did you spend on last week's class?"
  • Prior to exams, Mashtare used note cards to gather feedback on topics that he then included in exam reviews. He also asked students how they were experiencing his instructional strategies. On the exam given prior to spring break 2020, he asked students to answer questions about their Internet connectivity and computer access. About 10 percent of his students said they did not have access to reliable high-speed Internet or to devices capable of streaming video or using advanced Blackboard LMS functions. Based on those results, Mashtare knew he needed to provide low-tech options. His online plan was handed out and reviewed with students during the discussion/recitation period the Friday before they left campus. It included providing PDFs of lecture slides and transcripts of all lab videos, lectures, and recitations. Although materials were hosted on his LMS, Mashtare’s team also made significant use of email, both to share materials and for students to share their work back. This allowed students to make greater use of their cell phones to communicate.

  • After courses moved to emergency remote teaching, Mashtare sent students bi-weekly “sanity checks” on how the new format was working. Based on that feedback, his team began sending email reminders about upcoming deadlines, policies, and other course information.

Improving teams, course materials

Cézanne Elias, clinical assistant professor, Human Development & Family Studies, solicits informal and formal feedback before, during, and after her “Orientation to Current Issues in HDFS” course. This feedback has influenced the course significantly over time, making it a more valuable experience for new students and their student mentors.

The two groups are each enrolled in two one-credit courses, one for new HDFS students and the other for their more advanced mentors, who take an active role in developing and presenting curriculum designed to help their mentees connect to Purdue’s HDFS community and profession. Feedback influences everything from the process used to match each mentor to their small groups of mentees to the mentor training, topics, speakers, and assignments.

“[Mentor] students do such a good job of seeing and understanding from a student perspective,” Elias said. “And they do a great job of helping me understand the barriers to certain assignments or the applicability of specific assignments.”

Recent feedback helped Elias create additional leadership roles that allowed the mentors to connect better with each other, as well as with their groups. These connections helped during emergency remote teaching, when the groups figured out for themselves how to complete a previously assigned out-of-class activity by using synchronous virtual meet-ups. There was 100 percent participation in this assignment despite the physical distancing COVID required.

“Something that surprised me through students’ feedback is just how passionate they are about the major and about the department,” Elias said. “It’s exciting how much enthusiasm breeds enthusiasm.”

Students as partners in learning

Cara Putman, a lecturer in the Krannert School of Management, uses the Socratic method to gauge what students understand and how they are doing with the material, but also to “pull back the curtain” on learning so it becomes a skill that continues to develop after they pass the course.

“In our areas of study, learning is an ongoing process,” Putman said. “If I’m giving students the impression that they can learn everything they need to know in this 16-week class, then I am not serving them well.”

Like many law instructors, she uses dialogic questioning in her courses as a way to include student voice, and collaborate with students in finding answers. If she does not know the answer to a question asked in class, students are welcome to email the question to her and even remind her of it during the next class. Even better, she will stop class and work with students for a few minutes to think through how to find the answer.

“I’ll ask, ‘How would we find this information?’ I try to really invite them into that process.”

As an IMPACT Fellow, Putman was introduced to strategies such as “minute tickets,” which show whether students are connecting the dots between topics discussed in class and seeing those topics as relevant now and in their future. She sorts the responses into themes and uses them as collaborative questions to work on in the next class and to create study guides. In the future, Putman plans to continue turning this type of feedback into in-class activities, such as asking students to work on answers collaboratively.

Putman also reminds students that she values the end-of-semester evaluations and reads them all.

“When we let students know that we value their feedback, we are telling them that we value them,” Putman said. 

Student feedback as research

Elizabeth Karcher, assistant professor of Animal Sciences, has published several articles in the last few years based on student feedback used to make evidence-based decisions that increase student interest and motivation in her large Introduction to Animal Agriculture course.

The laboratory course typically serves approximately 200 first- and second-year students. When Karcher first taught the course in Fall 2017, it met in a large, fixed-seat lecture hall and included tours of area farms. Karcher wanted to promote interest in animal science by making better use of hands-on activities and other new learning elements, and to take advantage of the proximity of campus farms. The course later moved to WALC, which allowed for more collaborative team activities.

“For many of my students, I’m the first person they’re going to see when they come into the animal sciences major,” Karcher said. “So my goal in the course is not only to get the content knowledge across, but to really make sure that I’m engaging the students in a way to promote curiosity, interest, [and] motivation.”

Initially, Karcher used feedback from former students to develop hands-on activities such as case studies, think-pair-share, iClicker questions, and laboratory stations. She also introduced the use of written critical reflections after labs to help students relate better to what they experienced, as well as provide feedback on the stations and activities. Feedback showed these changes stimulated more student interest, as outlined in this article.

She continues to include student feedback to make changes in the course and publish on the subsequent improvements. This includes developing an understanding of how video lecture, lab stations, and case study activities affect student engagement. Based on that study, published here, Karcher’s instructional team worked to make the lab stations even more hands-on. Another article, in review, identifies improvement in student interest through the use of utility value reflections in the labs. Follow this link to Karcher’s course portfolio on ANSC 10200.

 ***

For assistance in soliciting and analyzing student feedback for program improvement and/or publication, contact the Center for Instructional Excellence by emailing innovativelearningteam@purdue.edu.