Video makes the coding star: teaching problem-solving in bite-sized chunks online

In 1979 the band The Buggles topped the charts with Video Killed the Radio Star, lamenting the decline of radio with the advent of television. Similarly, academics have long worried that providing lecture capture of their teaching will kill face-to-face lectures.

Students will not turn up; they’ll not watch the video; they’ll skip important parts of the lecture… These are all common warnings, while others argue that videoing lectures has no impact on attendance, with a more conciliatory view suggesting academics should embrace lecture capture regardless.

Whatever your views, even with a full or partial return to campus, student demand has seen online and recorded lectures accepted as the new norm across many courses.

The challenge of large-scale practical lab sessions

However, university isn’t all about lectures. Many subjects, especially STEM-based ones, feature practical solo and group activities, normally delivered on campus, often with specialized equipment or support. Computer science courses, for example, have programming modules that rely heavily on weekly practical work sessions, known as labs. Often lasting two hours, these are supported by the lecturer and, because of large class sizes, a relative army of teaching assistants.

The students work through practical tasks, referring to the provided final code solutions or asking for help if they get “stuck”. Practical labs are important learning junctions for students but not without drawbacks. Sometimes students are reluctant to ask for support, while others ask too readily, rather than working on the problem.

Management of the labs is difficult because of the one-to-one nature of the support required: the same question is often asked by many students, and responses from different helpers may not be consistent. The sessions can suffer from low attendance and completion rates, and with large cohorts it is difficult for the lecturer and support staff to answer students individually in short sessions.

Moreover, as with many practical activities, code development is a complex and active process, and simply providing the final code solution hides the evolving engineering and design decisions. When faced with a difficult problem and the completed solution, many students will complain “I don’t know where to start!”

The evolution of lab-based learning in lockdown

When lockdown was introduced, rather than simply replicate lab sessions online, the Queen’s University Belfast computer science department added videoed development of the coded solutions.

This broke the overall lab task into chunks. Lecturers provided pre-recorded videos for each chunk, with focused supplementary materials. A typical one-hour lab task comprised three to four bite-size chunks, with one active coding video per chunk.

Each chunk was arranged to sequentially explain each step in the larger process to produce the final code-based solution to the overall task.

The videos included a running commentary, enabling an evolving explanation of the design decisions and alternatives as well as demonstrating the use of the complex development tool during the active implementation.

This was a major departure from students simply reviewing a static coded solution with no insight into how it had been reached. The new format proved very popular with students, with increased engagement and overall lab completions. Based on the feedback from students, lecturers, and analytics via the virtual learning environment (VLE) and video platform, the following guidelines were established:

Break the content down – bite-size is enough of a mouthful! In line with other findings on learning-based videos, the highest audience retention (AR) was achieved by videos that were about 10 to 15 minutes in length. On average, a typical lab task had three to four active coding videos, one per step (or chunk) in the overall lab task.

Show your face, include the presenter. Talking-head videos scored higher in AR and overall views than those with just a disembodied voice.

Students are tolerant. Recording videos takes time and there may be inevitable retakes if it goes badly wrong. But don’t plan for perfection because students are surprisingly tolerant of minor imperfections. Aim for a level of presentation equivalent to face-to-face lecture quality.

Keep taking a pulse. Actively look for feedback. Many hosting platforms provide detailed engagement measurement analytics such as audience retention, views, drop-offs, and completion of tasks. An excellent feedback add-on is to provide a simple thumbs up or down vote for each video or chunk.

Act on feedback. If the section isn’t working, find out why and act. If the video isn’t working, replace it. Shorter videos are easier to replace or edit than longer videos.

Top and tail. Provide an introductory overview video and outcome solution. Many students requested a big-picture road map to contextualise the overall task together with the final coded solution, for example, all chunks combined.

Provide supplementary materials. Anticipate questions and provide initial FAQs for each chunk, updating them every time you get a common question. Include and monitor a peer-group chat facility with questions answerable by lecturers and peers. Kept live as an archive, these chats provided a good source for revision and a great peer-learning opportunity for the students. Include the relevant code snippets for each section.

Positive learning outcomes from video lab sessions and demos

Hosted on the VLE, the lab videos and demos were favorably received. However, one initially unnerving observation was that student questions slowed to a trickle during live sessions. After-session support via email fell by about 80 per cent. But “I am stuck” was replaced with more in-depth queries.

The attendance at online and eventually the on-campus lab sessions also dropped significantly, stabilizing at less than 40 per cent of the cohort. Yet the completion rate for lab exercises, as measured by the VLE at the end of the semester, soared from 60 to 95 per cent.

A common feedback theme from the students was that the videos “allow me to complete the lab whenever it is practical”. This is proof of engagement at a time convenient for students, but what about attainment? The final scores for practical assessments increased by more than 15 per cent.

The video analytics and student feedback show two distinct patterns of video usage. The more competent students tended to attempt each task on their own and then review the videos for comparison. Other students used the videos for a bump start and an occasional leg-up when they were stuck. Rather than ask directly, they were going to the videos.

Students reported that the videos were useful for revision and aided understanding of each new element. The videos appear to better support different learning speeds and styles, lifestyles, and students with disabilities. What started as a lockdown experiment is now the norm, with overwhelming student demand for videos – any tasks that were not videoed were requested.

The practice has led to peer-reviewed publications and a National Teaching Award, and has been widely adopted within other disciplines. While there is a significant overhead in staff recording each video solution, it is time well spent with an excellent return on investment.

“We can’t rewind, we’ve gone too far,” complained The Buggles; however, for those learning practical tasks, that’s a good thing. It would seem that “Video is making the programmer star”.

Aidan McGowan is senior lecturer in computer science at Queen’s University Belfast and a National Teaching Fellow.