Teaching basic lab skills
for research computing

Thinking About Teaching

A little over a year ago, we blogged about jugyokenkyu, or "lesson study": a bucket of practices that Japanese teachers use to hone their craft, from observing each other at work to discussing lessons afterward to studying curriculum materials with colleagues. Getting the Software Carpentry Foundation launched almost immediately pushed that aside, but now that the SCF is up and running, it's time to return to the subject. Discussion of how teaching practices are transferred is part of that; so are two other developments this week.

First, Victor Eijkhout has been thinking about how to teach MPI. In particular, he has been thinking about how the order in which concepts are usually introduced can mislead or confuse, and about a different order that leads more naturally to proficient use. We're not going to include MPI in Software Carpentry any time soon (if ever), but I hope that the next major revision of our lessons will include this kind of design rationale.

Second, Azalee Bostroem surveyed people who have taught our current Python lesson to find out what they're actually doing and how it has been going. 46 people responded, almost all of them instructors, and the results are fascinating. Here are some of the key findings, and my comments:

60% of our lessons are being taught to novices.

Here, "novice" means "someone who has never programmed before". (The survey defined "intermediate" as someone who had programmed in a language other than Python, and "advanced" as someone who had already used Python.) When we re-design this lesson, we could well wind up creating different versions for these audiences, since someone who's learning Python as a second language will be able to move faster and farther than someone who is new to ideas like "variable", "list", and "function".

53% of respondents would like an introduction to variables, data types, functions, and libraries before jumping into the inflammation data example; only 28% would not.

Related to this, 70% of respondents said that they carefully explained each part of the first episode of the Python lesson, which is another indication that we need to warm up rather than dive straight in.
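The survey doesn't prescribe what such a warm-up would contain, but a minimal sketch of the four requested topics (variables, data types, functions, and libraries) might look like the following; the names and values here are hypothetical, not taken from the lesson itself:

```python
# Variables and basic data types
weight_kg = 60.5            # a floating-point number
patient_id = "inflam_001"   # a string

# Defining and calling a small function
def pounds_from_kg(kg):
    """Convert a mass in kilograms to pounds."""
    return kg * 2.2

print(pounds_from_kg(weight_kg))

# Using a function from a library
import math
print(math.floor(weight_kg))   # prints 60
```

A warm-up at roughly this level lets novices meet each concept once in isolation before all four are combined in a data-analysis example.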

Only 1/3 of respondents got through the whole Python lesson.

Which means we need to cut material as well as tell instructors how far people usually get. (We've asked in the past for people to report times for lessons, but returns have been slim.)

Over 80% of the respondents are teaching our materials with little or no modification, but 22% are using an older version.

Based on a handful of conversations, I think people use older material because (a) they stick to whatever lesson they first taught, and/or (b) they don't like the NumPy-early approach of the current lesson. I don't know what we can do about (a), but our "Python as a first language" lesson should start with simpler tasks.
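To make the contrast concrete: where a NumPy-early lesson opens with a library call on a whole dataset, a "simpler tasks first" opener might stick to plain Python a novice can trace by hand. This is only an illustrative sketch with made-up data, not a proposed episode:

```python
# Plain-Python starter: no libraries, just variables, a list, and a loop.
temperatures = [20.5, 21.0, 19.8, 22.3]   # hypothetical readings

total = 0
for t in temperatures:
    total = total + t

average = total / len(temperatures)
print("average:", average)
```

Every step here uses only ideas the learner has just met, which is harder to guarantee when the first line of the lesson is a NumPy call.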

The defensive programming section is skipped by over 85% of the people who answered the survey, so we should move this material elsewhere.

I (reluctantly) agree, at least for people who are learning programming for the first time. As with testing, there's no point putting scarce classroom or maintenance time into something that doesn't stick.
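For readers who haven't seen that section, the material being skipped is essentially assertion-based checking of the kind sketched below; the function and its checks are an illustrative example, not the lesson's actual code:

```python
def normalize(values):
    """Scale a list of numbers so they sum to 1.

    The assertions are the defensive-programming part: they state
    what must be true of the input for the answer to make sense.
    """
    assert len(values) > 0, "need at least one value"
    total = sum(values)
    assert total > 0, "values must sum to a positive number"
    return [v / total for v in values]

print(normalize([1, 1, 2]))   # prints [0.25, 0.25, 0.5]
```

Useful as this habit is, it presumes learners are already comfortable writing functions, which is exactly why it doesn't stick with first-time programmers.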

82% of the people answering the survey are teaching out of the Jupyter Notebook.

I was surprised by that, but I now wonder if it explains why it takes so long to get through material with novices. On the one hand, the notebook is more like the graphical user interfaces they are used to. On the other hand, it's different in many ways as well, which means they could be facing the double cognitive load of a new programming language and a new kind of editing interface. As in the debate over whether phonics or whole language is the best way to teach children to read, what we need is a proper, unbiased study.

The biggest lesson in all of this is that being able to pool our experiences is the second-greatest benefit of teaching in the large. (The greatest is having someone who can genuinely sympathize when something has gone wrong...) It would take even our most active instructors several years to learn this much about our lessons; comparing notes can help us all improve faster.

We would like even more data: if you have taught Python for us and have three minutes to spare, please fill in the survey. And if you can give us half an hour, we'd be grateful for help sorting and summarizing the comments we've had on our Python lessons in debriefing sessions—please get in touch if you can spare some time.