I have been trying to identify threshold concepts in Chemistry, specifically in the Structure and Bonding topic. In the process, I’ve read a variety of sources (reading list here, please feel free to suggest others) and created this tracker with a list of tricky concepts to teach explicitly, test and retrieve. In this post, I looked at the optimum spacing gap for retrieval (using this blog post about research by Cepeda et al.).
Linda Needham suggested I measure pupils’ confidence about key concepts, so that I can use this to inform planning in the future. So, as part of a recent (online) homework, I questioned pupils in year 10 and 11 about concepts, and built a measure of confidence into the questions.
I marked the homework using simple colour coding (as I often do with Google Forms), which helped inform subsequent lessons: I was able to feed back on the concepts they found most tricky and pick up on common misconceptions.
I used a very simple “scoring” method, which allowed me to obtain a numerical measure of confidence for each concept, and track it for each class. It also highlighted the concepts that I needed to re-visit.
I also want to understand which key concepts are most difficult to understand and retain.
So I grouped the scores from each class together for every concept tested. I also tabulated how many correct/incorrect answers there were for each question, to see whether this gave a similar picture to the confidence measures.
This enabled me to list the concepts I tested in order of confidence.
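The pooling and ranking step can be sketched in a few lines of Python. The concept names, scores and the 1–4 confidence scale below are purely illustrative, not my real data:

```python
from statistics import mean

# Hypothetical pooled data: for each concept, the confidence ratings
# gathered across all classes (1 = "guessed" up to 4 = "certain").
scores = {
    "ionic lattices": [4, 3, 4, 2, 3],
    "covalent bonding": [2, 1, 3, 2, 2],
    "intermolecular forces": [1, 2, 1, 1, 2],
}

# List concepts from most to least confident, using the mean rating.
ranked = sorted(scores, key=lambda c: mean(scores[c]), reverse=True)
for concept in ranked:
    print(f"{concept}: {mean(scores[concept]):.2f}")
```

The concepts at the bottom of the list are the ones to prioritise for re-teaching and retrieval practice.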
We do make a difference!
I inadvertently asked my year 10 triple scientists a question about simple molecules, even though we haven’t covered this in class yet. You can see here the contrast between their answers and those from my other two groups (middle-ability year 10s and year 11 triples).
Nearly all of my year 10 triples think that covalent bonds break when chlorine evaporates, compared to the majority of my middle-set year 10s (and year 11 triples), who correctly said that this isn’t the case. This is a common misconception, and it’s reassuring to see the change in conceptual understanding once I’ve taught them about it during lessons!
Knowledge, understanding, application
I realise that there is a difference between understanding something and being able to apply it to a new situation. It was evident from these questions that this is something that my middle set year 10s struggle with more than my triple scientists from the same year group. They did relatively well on the conceptual questions, but they did comparatively much worse in two questions where they had to either remember facts or apply understanding in a more complex way.
The first question was a very straightforward question about sub-atomic particles. My year 10 middle set (10D) had evidently forgotten the relative charges or identities of the particles, whereas my year 10/11 triples had retained this (10T/11T). The second question asked students to give the ratio of magnesium:chloride ions in a lattice. Many students found this tricky: they had to identify that it is a 1:2 ratio, because each Mg²⁺ ion balances two Cl⁻ ions, even though it is the magnesium that carries the larger charge. The middle-set year 10s (10D) struggled more with this than the other groups (11T and 10T).
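The charge balancing behind that ratio can be worked out mechanically: swap the magnitudes of the two charges, then divide by their greatest common divisor to get the smallest whole-number ratio. A minimal sketch (the function name is my own, purely illustrative):

```python
from math import gcd

def ion_ratio(cation_charge, anion_charge):
    """Smallest whole-number cation:anion ratio that balances charge."""
    g = gcd(cation_charge, anion_charge)
    # The cation count comes from the anion's charge, and vice versa.
    return anion_charge // g, cation_charge // g

print(ion_ratio(2, 1))  # Mg2+ with Cl-  -> (1, 2), i.e. MgCl2
print(ion_ratio(3, 2))  # Al3+ with O2-  -> (2, 3), i.e. Al2O3
```

Seeing the charges “cross over” like this is exactly the reasoning step that question was probing.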
What am I testing?
One of the things I’ve really had to think about this term, when designing questions either for homeworks or for hinge questions, is what I’m actually trying to find out. For example, I asked whether this statement was true or false:
“Each molecule of sodium chloride contains one sodium ion and one chloride ion.”
I had asked this to see how many pupils would say: false (because sodium chloride doesn’t form molecules). But I realised with hindsight that it didn’t really test their understanding of key language. It was just a trick question. It was useful as a discussion point when we went through it in class, because this is a mistake that students often make, but it wasn’t a useful question to measure understanding.
It has been helpful to see which concepts students find most tricky for when I teach the topic in the future. I will also use this work to help me plan retrieval practice as I go through the year. I would like to see how students’ confidence changes as we go through the course, and to use this to plan future reviews.
I’d also like to see how explicitly teaching these concepts from the start affects students’ understanding. It is interesting, for example, that none of my year 10s (double or triple) answered incorrectly the question about sodium ions being attracted to any negative ion (regardless of its identity), whereas my year 11 triples did comparatively worse on it. I was careful to introduce this concept at the same time as teaching ionic bonding with my year 10s, whereas my year 11s weren’t taught ionic bonding in this way last year. I did emphasise the point when I taught them about structure this year, but it seems to have had less effect when introduced later.
Other research (and the ASE conference)
The author of this article (one of the first I read on threshold concepts) is speaking at the ASE conference next month, so I’m looking forward to hearing that talk.
I’ve also found this talk at the ASE conference by this researcher.
I read his paper, and it has been useful to see how he uses confidence measures for conceptual understanding (in this case for reaction kinetics). His confidence grid is more complex than mine, and he also checks confidence for explanations, as well as for the conceptual understanding itself:
This is his scoring method (CF, CFC and CFW are similar to the approach I’ve taken):
“CF (mean confidence): sum of confidence ratings of students for a question divided by number of students.
CFC (confidence when correct): sum of confidence ratings of students for a question correctly answered divided by number of students who correctly answered.
CFW (confidence when wrong): sum of confidence ratings of students for a question incorrectly answered divided by number of students who incorrectly answered.
CDQ (confidence discrimination quotient): (CFC − CFW)/ standard deviation across all confidence ratings. It can be interpreted in terms of the ability of the students to discriminate between what they know and what they do not know (Lundeberg et al., 2000).”
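Those four measures are straightforward to compute once each student’s confidence rating and correctness are recorded together. A minimal sketch, assuming (this data shape and function name are my own, not taken from the paper) each question’s responses arrive as a list of (confidence, correct) pairs:

```python
from statistics import mean, stdev

def confidence_metrics(responses):
    """Compute CF, CFC, CFW and CDQ for one question.

    `responses` is a list of (confidence_rating, answered_correctly)
    pairs — an assumed data shape, purely for illustration.
    """
    confidences = [c for c, _ in responses]
    right = [c for c, ok in responses if ok]
    wrong = [c for c, ok in responses if not ok]

    cf = mean(confidences)                    # mean confidence, all students
    cfc = mean(right) if right else None      # confidence when correct
    cfw = mean(wrong) if wrong else None      # confidence when wrong

    # CDQ = (CFC - CFW) / standard deviation of all confidence ratings;
    # undefined if everyone was right, everyone was wrong, or n < 2.
    cdq = None
    if cfc is not None and cfw is not None and len(confidences) > 1:
        cdq = (cfc - cfw) / stdev(confidences)
    return cf, cfc, cfw, cdq
```

A positive CDQ means students were, on average, more confident when they were right than when they were wrong.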
I’m not sure I could even start with a CDQ right now, but it was interesting to read his research, and I’m looking forward to hearing him speak about it next month.
For now, I’ll just use this to help me inform my next steps (and those of my students).