Debbie on August 13th, 2015
Matt Dyki from the Department of Accounting has started using LMS tests as a way of allowing students to conduct peer review of their project team members. Students use the test to answer a series of questions about each team member. The student responses can be viewed by staff, and the collected data can be analysed and interpreted, allowing “free riders” to be identified and their group project mark adjusted accordingly.
For several years Matt has been coordinator of ACCT10001: Accounting Reports and Analysis. This is a large subject, with 1200 students in SM1 and approximately 1700 per annum. The subject is an introductory accounting subject for accounting majors and a terminal subject for non-accounting majors. In SM2-2012 the subject underwent a major re-write, and students now complete a group assignment as part of their assessment. While the learning outcomes from participating in group work are highly desirable, there have been a number of complaints from students that their peers were not doing the group work, leaving others to carry the burden.
The group assignment is in two parts. First, groups undertake a financial statement analysis of two hypothetical companies in the same industry, with the results provided to students one hour after the submission deadline. Second, each group writes a business report recommending which of the two companies their client should buy.
To complete the assignment, students are asked to find colleagues and form groups of 3-4 people. Students are not limited to selecting group members from within their tutorial. Last semester the 320 groups were registered via the Faculty of Business and Economics (FBE) Assignment Tool rather than the LMS sign-up groups tool, to minimise the administrative problem of students accidentally signing up to the wrong group. This practice will have to be reassessed now that the FBE Assignment Tool is being decommissioned.
Since the course re-write Matt has investigated a range of options for gathering and analysing feedback from students about their group members. These included the FBE-developed tool Groupworks (good, but no longer supported by FBE), PRAZE (not really suitable for identifying the “free rider” issue) and LMS surveys (anonymous submissions and the lack of scores complicated the data received, meaning Matt had to make a judgement call on the level of penalty to apply). One positive, and perhaps surprising, outcome of the LMS surveys was that many students who were ‘free-riding’ self-identified.
Last semester Matt used an LMS test to gather the group member data. Questions within the test allow students to nominate which group they belong to and who their group members are, and to rate each member’s participation on an opinion scale. For the first time since Matt began gathering this type of data and adjusting the grades of students who had not fully participated, there were no complaints from students who had been ‘marked down’.
Tests overcame the problems experienced with surveys: they automatically identify the person submitting the information and apply scores to the opinion-scale questions. The scores allowed Matt to easily gather all the data about each individual student and, from the average score given by their peers, determine who had been identified as not participating fully in the group assignment. These students were then contacted by Matt via email and advised that their participation in the group project was being assessed. They were asked to demonstrate their level of participation by sending in details of the group work they had contributed to. No student responded to this request, so each was penalised.
So how did Matt take the data from the LMS test and analyse it? The test download resulted in a scary 200K cells of data. Judiciously deleting unneeded columns reduced this to a more manageable 40K cells.
Only three formulas were used to prepare the data: VLOOKUP, COUNTIF and SUMIF, plus a division to determine the average peer mark for each student. Add in some conditional formatting and you end up with a colour-coded record that is easy to scan for the students with the lower scores.
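For anyone who would like a concrete picture before opening Matt’s template, here is a rough sketch of how those three formulas might fit together. The layout below is an assumption for illustration, not Matt’s actual spreadsheet: suppose the cleaned-up download has been arranged so each row records one rating, with the rater in column A, the group member being rated in column B and the score given in column C, and the class list of student usernames runs down column E.

Number of ratings received by the student named in E2:
=COUNTIF($B$2:$B$5000, E2)

Total of the scores that student’s peers gave them:
=SUMIF($B$2:$B$5000, E2, $C$2:$C$5000)

Average peer mark (the SUMIF divided by the COUNTIF):
=SUMIF($B$2:$B$5000, E2, $C$2:$C$5000) / COUNTIF($B$2:$B$5000, E2)

Looking up a student’s registered group from a separate sheet (here a hypothetical “Groups” sheet):
=VLOOKUP(E2, Groups!$A$2:$B$1300, 2, FALSE)

A conditional formatting rule on the average column, highlighting values below a chosen threshold, then gives the colour-coded record described above.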
One issue remains: Matt still has problems preventing students from using cut and paste, which ends up polluting the test download data with HTML codes. Perhaps in future students will read the repeated requests not to cut and paste their responses.
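Until that day arrives, one possible stop-gap (again, just a sketch, and only useful when the stray codes are a small, predictable set) is to scrub responses with nested SUBSTITUTE calls before analysing them. For a response in cell A2, something like:

=TRIM(SUBSTITUTE(SUBSTITUTE(SUBSTITUTE(A2, "<br>", " "), "&nbsp;", " "), "&amp;", "&"))

This will not catch every tag a browser might smuggle in, but it deals with common offenders without leaving the spreadsheet.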
If anyone is interested in getting a copy of the questions used, you can import the test into your playpen: https://app.lms.unimelb.edu.au/bbcswebdav/xid-17439093_3
Similarly, if anyone would like a copy of the template spreadsheet Matt used, he has kindly made this available: https://app.lms.unimelb.edu.au/bbcswebdav/xid-17442038_3
And finally, if anyone would like to chat to Matt Dyki about how he manipulated the data from the test, he would be happy to discuss this: firstname.lastname@example.org