Tuesday, May 19, 2015

Last semester's lecture videos and test performance

I am teaching the same history class this semester as last. It is face to face, and I lecture a lot. Last semester I used an Apple TV, an iPad Mini, and the Explain Everything app to record each of my lectures, which I made available to students for their review. This semester, instead of recording again, I am giving my students access to last semester's lectures. I tell them the content is substantially the same, as is what is important for the exams; the only difference is the jokes. I then wanted to see if there was any correlation between their performance on exams and whether they had looked at those online resources.

Fall 2014

In fall 2014 I had four exams, each worth 80 points, and I dropped the lowest score. On each exam the questions based on lecture were worth 40 points, and my class overall had the following scores for that part of each exam:
  • 56% = Exam 1 (72 total attempts)
  • 60% = Exam 2 (72 total attempts)
  • 70% = Exam 3 (71 total attempts)
  • 63% = Exam 4 (71 total attempts)
I provided access to lecture materials in two different ways: the slide deck I used for the presentation, which was made available at some point before lecture, and the recordings of those lectures. The recordings are on YouTube but embedded inside the course offering on our learning management system, so I can look at the access statistics for each of these items and tell which students had clicked on those links. I of course do not know whether they actually watched the videos, looked at the slide decks, or (best of all) downloaded those slide decks to help them take notes during lecture. At least the clicks can tell me something:

Students who clicked on at least one of the slide decks:

  • 58% = Exam 1 (38 out of 72 total attempts)
  • 58% = Exam 2 (27 out of 72 total attempts)
  • 75% = Exam 3 (22 out of 71 total attempts)
  • 63% = Exam 4 (12 out of 71 total attempts)

Students who clicked on at least one of the lecture videos:

  • 69% = Exam 1 (21 out of 72 total attempts)
  • 67% = Exam 2 (17 out of 72 total attempts)
  • 80% = Exam 3 (16 out of 71 total attempts)
  • 75% = Exam 4 (10 out of 71 total attempts)
These statistics point to a correlation between watching the videos and exam performance, but apparently none for viewing the slide decks. One weakness of the data from the system is that it shows the last date a student clicked on one of those links but not the first, so I still included students whose last click came after the associated exam, since an earlier click cannot be ruled out.
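
For anyone curious how these subgroup averages can be computed, here is a minimal sketch in Python. It assumes a hypothetical CSV export with one row per student per exam; the file name and column names are my own invention for illustration, not anything a real learning management system produces.

    import csv

    def pct(rows, keep):
        # Average lecture-question percentage over the rows matching a filter.
        scores = [float(r["lecture_score"]) / float(r["lecture_possible"])
                  for r in rows if keep(r)]
        return 100 * sum(scores) / len(scores) if scores else float("nan")

    # Hypothetical export: one row per student per exam, with the last
    # click date (or a blank) for the slide decks and the lecture videos.
    with open("exam_clicks.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    everyone = lambda r: True
    # The system reports only the LAST click date, so a last click after the
    # exam does not rule out an earlier one; those students are still counted.
    clicked_deck = lambda r: r["deck_last_click"] != ""
    clicked_video = lambda r: r["video_last_click"] != ""

    for exam in sorted({r["exam"] for r in rows}):
        sub = [r for r in rows if r["exam"] == exam]
        print(exam,
              f"all={pct(sub, everyone):.0f}%",
              f"deck={pct(sub, clicked_deck):.0f}%",
              f"video={pct(sub, clicked_video):.0f}%")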

Spring 2015

A key difference between the semesters is that these students have access to last semester's lectures. Since I am not recording lectures this semester, I also mention the recordings less often in class. In this class there are four exams, each worth 60 points. I do not drop the lowest score, but the exams are part of a group of assignments worth 640 possible points, of which only 500 count toward the course grade. On each exam the questions based on lecture were worth 30 points. As of the first writing of this post, only one exam had been given:
  • 76% = Exam 1 (58 total attempts)
As last semester, both the slide decks I used for each presentation and the recordings of those lectures are embedded inside the course offering on our learning management system. The slide decks and recordings for the entire semester have been available since the beginning:

Students who clicked on at least one of the slide decks:

  • 83% = Exam 1 (17 out of 58 total attempts)

Students who clicked on at least one of the lecture videos:

  • 90% = Exam 1 (5 out of 58 total attempts)
After the first exam there is a relationship similar to last semester's, but very few students clicked on at least one video link. I plan to share this data with this semester's class and encourage them to view the videos. I hope their overall performance improves, and I will follow up after each exam this semester.

Spring 2015 Update for Exam 2

I told my students about the correlation between video access and exam scores, shared what I discovered last semester, and showed them their data from Exam 1 this semester. I found the following for the lecture questions for Exam 2:

Overall class average:

  • 62% (61 total attempts)

Students who clicked on at least one of the slide decks:

  • 64% (31 out of 61 total attempts)

Students who clicked on at least one of the lecture videos:

  • 71% (14 out of 61 total attempts)
It is nice to see that more students viewed the slide decks and last semester's videos, and that the score boost continues.

Spring 2015 Update for Exam 3

I did not make a point of reminding students about the availability of last semester's lectures, except for one class meeting that I canceled, telling my students instead to watch last semester's lecture on the corresponding topic.

Overall class average:

  • 63% (58 total attempts)

Students who clicked on at least one of the slide decks:

  • 64% (27 out of 58 total attempts)

Students who clicked on at least one of the lecture videos:

  • 67% (14 out of 58 total attempts)
Again the score boost continues, though to a smaller degree than before. I did notice two students who had reviewed at least one slide deck and video but did not take Exam 3. Both did very well on the first two exams and are likely exploiting the way the class grade is structured (the exams are among a group of assignments worth 640 possible points, of which only 500 count toward the class grade).
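
To make that incentive concrete, here is a minimal sketch in Python of the cap as I understand it from the description above: the assignment group offers 640 possible points, but only 500 count, so points beyond 500 add nothing. The function name is my own illustration, not how our gradebook actually computes grades.

    def capped_grade(points_earned, cap=500):
        # Of the 640 possible points in the assignment group, only the
        # first `cap` (500) count toward the course grade.
        counted = min(points_earned, cap)
        return 100 * counted / cap  # percentage of the countable maximum

    # A student already at 500 earned points gains nothing from another exam:
    print(capped_grade(500))       # 100.0
    print(capped_grade(500 + 60))  # still 100.0 -- skipping an exam costs nothing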

Special note for missed class:

I was away from campus for one lecture day this semester and decided to have students watch one of my recorded lectures from last semester. One of the exam questions covered that lecture, and all 58 students who took the exam answered it. Only 30 got it correct, so their score was worse for that lecture (52%) than for the other lectures (64%, excluding that question). Fifteen students looked at either the slide deck or the video, and nine of them answered the question correctly (60%). Though the number of students who viewed the slide deck was roughly the same as for the other lectures on this exam (13, compared to an average of 15), there was a bigger difference in the number who viewed the video (10 compared to five).

Spring 2015 Update for Exam 4

Overall class average:

  • 72% (54 total attempts)

Students who clicked on at least one of the slide decks:

  • 76% (11 out of 54 total attempts)

Students who clicked on at least one of the lecture videos:

  • 70% (5 out of 54 total attempts)
This time the score boost for watching the videos did not occur. Though the number of attempts dropped by just a few, the number of students looking at the slide decks and videos dropped almost to the level of the first exam, before I had shared the correlation with their scores. The exams were among the assignments with a cap on the number of earned points that could be applied toward the final grade, so it is possible that students who were near that maximum did not feel the need to study as hard for the lecture portion of the exam. They wrote some of the other questions themselves and in general did better on that portion of the exams.

This post was first published on 24 February 2015 and edited on 25 March, 30 April, and 19 May.
