More than twenty years ago, I lived in a university town in Canada and taught math and engineering courses. In every one of those courses, cheating on assessments and exams was a constant concern. In practice, the tools we had for catching cheaters were largely manual; a marker might notice, for example, that two scripts had suspiciously similar answers. Exam procedures were also much less strict than what I had experienced as a student, which did not help. Finally, the process of disciplining a student was extremely long and often unsatisfactory, and teachers learned through bitter experience that it was often not worth the effort.

My response to all this was to frustrate cheaters by design. I avoided multiple choice in favour of written answers, and I weighted the exam more heavily than the course work. If a student could not attend the scheduled exam, I offered them an oral exam in a one-to-one session. It was amazing how many students declined that offer and found they could attend the scheduled exam after all.

Fast forward to 2020: my children are university age, but they were sent home at the start of the pandemic and ended up writing their exams by computer. I overheard comments like “cheating is going to be rampant on this one”. I started to look into it, and it seems the problem has only become worse, with stories like this one. So, universities have to deal with cheaters in their communities. For reputational reasons they deal with it behind closed doors, or, more often than not, they do not do enough, and the problem gets pushed onto the employers who hire university grads.

Technology has no doubt helped cheaters, but can technology catch cheaters? In 1936, Ronald Fisher showed statistically that Gregor Mendel’s experimental results were too good to be true, so there has got to be hope for us, over eighty years later.

Outlier detection finds unusual behaviour. So, if we assume that cheating is not in fact the norm, then there is potential to find cheaters. At the same time, we can expect to also flag students who are struggling, or genuinely outstanding students. So, just as when outlier detection is used to find financial fraud or fraudulent expense claims, there are going to be false positives.
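To make that concrete, here is a minimal sketch of the kind of outlier screen you could run yourself over a table of answers, using an isolation forest from scikit-learn. This is not how the Penny Analytics service works internally, and the file name and column layout are assumptions for illustration only.

```python
# A minimal sketch (not the Penny Analytics method) of outlier detection over
# multiple-choice responses. The file name and column layout are assumptions:
# one row per test taker, columns q1..q30 holding answers A-D.
import pandas as pd
from sklearn.ensemble import IsolationForest

answers = pd.read_csv("responses.csv")

# One-hot encode the categorical answers so the detector sees numeric features,
# with an extra indicator column for missing answers.
features = pd.get_dummies(answers, dummy_na=True)

# Fit an isolation forest and flag roughly the most unusual 5% of rows.
detector = IsolationForest(contamination=0.05, random_state=0)
answers["outlier"] = detector.fit_predict(features) == -1

print(answers[answers["outlier"]])
```

Whatever the detector, the flagged rows still need a human to decide whether they are cheating, poor data, or an unusual but honest student.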

Our multiple choice dataset comes from an R statistical library.

The dataset is small, with only 143 test takers and 30 multiple choice questions. We have made minor changes to the dataset: we added a rownumber column and totalled the marks. Here is the data profiling report:
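For anyone who wants to reproduce that preparation step on a similar file, a sketch along these lines would do it. The file name, column names and answer key below are hypothetical placeholders, not the real ones.

```python
# A sketch of the preparation described above: add a rownumber column and a
# total score. The file name, column names and answer key are hypothetical.
import pandas as pd

answers = pd.read_csv("multiple_choice.csv")          # one row per test taker, columns q1..q30
answer_key = ["A", "C", "B", "D"] * 7 + ["A", "C"]    # hypothetical 30-item key

# Add a 1-based rownumber so outliers can be traced back to the original file.
answers.insert(0, "rownumber", list(range(1, len(answers) + 1)))

# Total the marks by comparing each answer column against the key.
question_cols = [c for c in answers.columns if c.startswith("q")]
answers["total"] = sum(
    (answers[col] == correct).astype(int)
    for col, correct in zip(question_cols, answer_key)
)

answers.to_csv("multiple_choice_with_rownumber.csv", index=False)
```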

multiple_choice_with_rownumber_profiling_report.html

Then we submitted our dataset for outlier detection. What we got back were five outliers, but none of them looked like cheaters. Their total scores were 0, 1, 3, 4 and 8 respectively, out of 30! Instead, what we were seeing amounted mostly to poor data: one row was completely empty, and three contained nines in the answers, when the answers should have been A, B, C or D. The one outlier without data issues was the test taker who scored 8/30. The reason code for this outlier was the test taker’s answer to question 2. This might have been a genuinely struggling student.

We have seen this before. The first screen for outliers yields more data quality issues than anything else. So, what we did was remove the outliers found in round one and resubmit our file. If you have a similar experience using the Penny Analytics outlier detection service, please contact us; we are happy to give you a “round two” for free.
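As a rough illustration of that clean-up, a simple validity screen like the one below would catch most of the round-one outliers (the blank row and the out-of-range answers) and drop them before resubmitting. Again, the file and column names are assumptions, and this is a sketch rather than the actual workflow.

```python
# A sketch (assumed file and column names) of the round-one clean-up:
# flag blank rows and answers outside A-D, then drop them before round two.
import pandas as pd

answers = pd.read_csv("multiple_choice_with_rownumber.csv", dtype=str)
question_cols = [c for c in answers.columns if c.startswith("q")]
valid = {"A", "B", "C", "D"}

# Rows with no answers at all.
blank_rows = answers[question_cols].isna().all(axis=1)

# Rows containing answer codes outside A-D (for example, the nines above).
bad_codes = answers[question_cols].apply(
    lambda col: col.notna() & ~col.isin(valid)
).any(axis=1)

clean = answers[~(blank_rows | bad_codes)]
clean.to_csv("multiple_choice_with_rownumber_round2.csv", index=False)
```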

Our “round two” outliers told a similar story, with some more numeric responses and a bunch of low scores (5, 7, 8, 10, 11, 13). So, in this dataset at least, we do not seem to have found cheaters, just poor data and students who need support.

The full results from the outlier detection are here:

multiple_choice_with_rownumber_round2_penny_outliers_2093_223

Do you work with data and need to find unusual records? Or are you responsible for data quality? At Penny Analytics, users of our outlier detection service give us high marks because they can capture benefits quickly from AI and machine learning. If you haven’t done so already, register and get started with our free trial datasets.
