Putting students in charge of their behaviour and progress

by Terry Freedman on March 16, 2018

Programs like Emerge, Xporter and Groupcall's Analytics are very good at collating data, and alerting you to trends, oddities and possible impending disasters. However, from the student's point of view, it is a case of 'being done to' rather than being in charge. What if you were able to reverse this dynamic?

Some research has been taking place in the field of what has been called 'student-led analytics'. In this scenario, students are given report cards or score cards that detail aspects of their progress and behaviour. It is also possible to include comparative metrics in such reports.

For example, the score card might say "Your attendance has been 96% so far this term. The average attendance for your class has been 98%."

If your school or schools use online resources, such as digitised and online textbooks, you might even be able to include metrics on students' progress through those materials, such as "You have covered 36% of the materials so far." If your VLE permits students to rate the materials or make recommendations, you can also include Amazon-style comments such as "23 other people in your group found this resource useful."
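To give a feel for how little machinery a comparative score-card line needs, here is a minimal sketch in Python. The data, student names and wording are invented for illustration; a real implementation would pull the figures from your MIS or VLE rather than a hard-coded dictionary.

```python
# Illustrative sketch: building a comparative attendance line for a
# student score card. All names and figures below are hypothetical.

def attendance_rate(sessions_attended, sessions_held):
    """Return attendance as a whole-number percentage."""
    return round(100 * sessions_attended / sessions_held)

def score_card_line(student, class_records):
    """Build a comparative attendance sentence for one student.

    class_records maps each student to (sessions attended, sessions held).
    """
    own = attendance_rate(*class_records[student])
    class_average = round(
        sum(attendance_rate(a, h) for a, h in class_records.values())
        / len(class_records)
    )
    return (f"Your attendance has been {own}% so far this term. "
            f"The average attendance for your class has been {class_average}%.")

# Hypothetical class data: student -> (sessions attended, sessions held)
records = {
    "Alice": (48, 50),
    "Ben": (50, 50),
    "Chloe": (46, 50),
}

print(score_card_line("Alice", records))
```

The same pattern extends to any metric the school already collects, such as percentage of materials covered or resource ratings: compute the individual figure, compute the comparator, and render both in plain language.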

This kind of report effectively puts students in the driving seat. In fact, you might even go a stage further, and ask students to decide what they would like feedback on in their score cards or reports. This has been the approach of an experiment at the University of Edinburgh:

"The ‘Learning Analytics Report Card’ (LARC) captures data from an individual student’s course-related activity, and presents a summary of their academic progress in textual and visual form. However, rather than manifesting through hidden and inaccessible institutional data aggregation and analysis, the LARC offers students an opportunity to play with their data; to choose what is included or excluded, when the report is generated, and how it might be presented."

It's worth knowing about this because it is often the case that what universities try out today gradually works its way down the educational system in a less complex form.

According to the Open University's Innovating Pedagogy report of 2017,

"Student-led learning analytics allow learners to share their goals with teachers. They provide opportunities to discuss how these goals might be reached, or adjusted, as priorities shift. Unlike previous learning analytics applications that are focussed on learning goals set by teachers, student-led analytics allow individuals to identify their own learning goals and use these to develop their own routes to success."

The report makes an interesting observation. It states that different learners may have different goals. For example, a student who is getting low grades may be interested in particular topics and questions, but not that bothered about how they fare overall. On the other hand, some students may have little interest in anything apart from gaining good grades in the final examinations. The report goes on to state:

"This range of motivations and study patterns means that both institutional analytics focused on grades and learning analytics that predict student outcomes based on previous behaviour and engagement may wrongly identify which learners are at risk. They may also fail to encourage others who want to be pushed to the limit."

Clearly, the approach of providing students with such personalised feedback based on criteria of their own choosing would not be suitable for primary schools, and may be difficult to implement in its entirety for a whole secondary school. But score cards based on the data which schools are collecting and analysing anyway should be quite feasible.

Some teachers have successfully tried a version of this approach in the area of marking. Giving students other students' work, or even their own, and asking them to grade it according to a sheet of criteria gives them insight into what is being looked for, and therefore what they need to pay attention to in future. A score card of the sort discussed here would simply be a version of that.

If you are concerned about the difficulty of such an undertaking, or whether the benefits would justify the costs in terms of time and effort, why not set up a pilot study involving a group of, say, 20 student volunteers? You have the tools to collect, collate and present the data. Why not try handing the data over to students to see how they respond to it?


Innovating Pedagogy 2017 pp 29-31

Innovating Pedagogy 2016 pp 32-34


Topics: Groupcall Emerge, Groupcall Xporter, Groupcall Analytics