Making learning visible

18 Oct, 2018

This post is the first of two on approaches used in the subject Arguments, Evidence, and Intuition to provide opportunities for practice and formative feedback in the run-up to a larger written assignment.

You’ve designed some tasks to support learning the key concepts and skills in your subject, and talked students through what they need to know. But how do you, and your students, check that they’ve understood?

Last semester, in the undergraduate elective Arguments, Evidence, and Intuition, I trialled an approach to make this learning more visible, in which we:

  • Ran (almost) weekly quizzes through Google Forms, where the content of each quiz was (a) directly related to the topics to be learnt that week, and (b) tied explicitly to a written assignment due mid-semester.
  • Based the quizzes on working with real data (more on this in a future post) and authentic examples – for example, looking at NSW housing data, just as students would need to do in their mid-semester submission.
  • Ended each quiz with three optional ‘minute paper’ style questions:

Q1: Imagine you were writing your report based on this quiz dataset. Write a short paragraph that highlights some of the key claims you’d make to create a data story based on the analysis above.

Q2: Have you learnt anything new this week? Tell us about it!

Q3: Is there anything you’re still unsure about or would like us to discuss more? Or any other feedback on this week? Let us know!

Responses were reviewed each week and used to calibrate future activity, and in some cases to follow up with individual students.
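The weekly review step can be kept lightweight by running a short script over the quiz responses. A minimal sketch, assuming the Google Forms responses have been downloaded as a CSV whose column headers are the question texts (the sample data and column names here are hypothetical, not the subject's actual quiz export):

```python
import csv
import io

# Hypothetical Forms export: column headers are the question texts.
SAMPLE_CSV = """Timestamp,Q2: Have you learnt anything new this week?,Q3: Is there anything you're still unsure about?
2018-08-06,How to read a box plot,
2018-08-06,Correlation vs causation,Still unsure about p-values
2018-08-06,,p-values again
"""

def summarise(csv_text, learnt_col, unsure_col):
    """Count non-empty answers to the two minute-paper questions,
    and collect the 'unsure' comments for possible follow-up."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    learnt = [r[learnt_col].strip() for r in rows if r[learnt_col].strip()]
    unsure = [r[unsure_col].strip() for r in rows if r[unsure_col].strip()]
    return {"responses": len(rows), "learnt": len(learnt),
            "unsure": len(unsure), "follow_up": unsure}

summary = summarise(
    SAMPLE_CSV,
    "Q2: Have you learnt anything new this week?",
    "Q3: Is there anything you're still unsure about?",
)
print(summary)  # counts, plus the 'unsure' comments to raise next week
```

Even a tally this simple makes it easy to spot a week where "unsure" responses spike on a particular topic.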

Q1: Imagine you were writing your report based on this quiz dataset. Write a short paragraph that highlights some of the key claims you’d make to create a data story based on the analysis above.

Analysis of the first question allowed me to draw out and provide whole-cohort feedback on examples of student writing. These were also used as reference exemplars when the students were asked to peer-assess the preliminary analyses for their own assignments, drawing attention to three types of response: (1) ones that described the data but didn’t interpret it; (2) ones that provided commentary without reference to the data; and (3) ones that effectively integrated data into critical interpretation.

We made use of these as part of a peer review guidance sheet, which provided the examples alongside feedback, to support students in thinking about their own data stories, and prompt whole class and peer discussion.

You can access the guidance sheet here: Writing with and about numbers.

Q2: Have you learnt anything new this week? Tell us about it! and Q3: Is there anything you’re still unsure about or would like us to discuss more? Or any other feedback on this week? Let us know!

These two questions in particular gave feedback on what was working to support learning, and where there were misconceptions or gaps in knowledge that might need addressing.

For each of the topics covered, far more students flagged that they had learned the content than that they required further support. Some students made general comments indicating information had been learned, or that further support was required, and in a couple of cases students gave useful feedback on the accessibility of resources (small URLs on a projector, poor colour choices for accessibility).

These responses included perspectives that also led to interesting further follow-up. On the one hand, students noted the importance of ethical use of data and the need to scrutinise data – great stuff! Others said things like: “how easy it is to manipulate [data] to depict what you want”, or “how to trick people with statistics”. These kinds of response prompted useful discussion in class about the need to be critical without being so sceptical as to assume false equivalence across all uses of statistics.

Other tools…

If you’re interested in taking a similar approach but want something really simple, one method many academics use is an interactive poll (‘feedback in a flash’). You might also like to explore the previous post Are weekly interactive quizzes the answer?, in which Natalie Krikowa talks through her approach to entry-quizzes with questions that:

  1. Review – what did we do last week?
  2. Raise – what are we doing this week?
  3. Reinforce – what should I be remembering?

Feature image by Karl JK Hedin.
