Certain University of South Africa (Unisa) students writing online assessments during the October/November exam period have been informed that they will need to use a piece of software called IRIS.

IRIS, or Intelligent Remote Invigilation System, is software that records audio, video and computer activity during online exams. It then uses machine learning to “automatically [flag] potential academic dishonesty”.

Students who have been instructed to use IRIS have raised concerns about the system, particularly around data usage, privacy, and accuracy.

We put these questions to Unisa, and its Media Affairs office in the Department of Institutional Advancement replied with the answers below.

It makes for some very interesting reading, both for students and for those interested in this kind of monitoring software.

The questions of accuracy and bias are especially pertinent, and Unisa seems rather sure of IRIS on both fronts.

Hypertext: IRIS uploads audio and video, which uses data. While Unisa is in the process of providing 10GB of anytime data to students, exams which make use of IRIS have the potential to use considerably more data and may deplete the allocations of certain students.

Unisa: Indeed, the technology does require data; however, the allocation provided by the institution is more than sufficient for students to complete all their assessments. It should be noted that IRIS is only to be used in the assessment of certain modules offered in the College of Science, Engineering and Technology (CSET), and only 400 MB of data will be required for the longest assessment of three hours. This means that if a student has a maximum of 10 proctored assessments, only 4GB of the 10GB allocated will be dedicated to this process. Most students are not writing that many examinations.
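Unisa's arithmetic does check out on the figures it quotes, assuming a decimal conversion of 1 000 MB per GB (a binary 1 024 MB/GB conversion would give slightly under 4GB). A quick sketch using only the numbers stated above:

```python
# Figures as stated in Unisa's reply (worst-case scenario)
mb_per_assessment = 400   # longest (three-hour) proctored assessment
max_assessments = 10      # maximum proctored assessments per student
allocation_gb = 10        # anytime data allocation provided by Unisa

# Total IRIS usage in GB, using 1 000 MB per GB
total_gb = mb_per_assessment * max_assessments / 1000
remaining_gb = allocation_gb - total_gb

print(f"IRIS usage: {total_gb:.1f} GB of {allocation_gb} GB allocation")
print(f"Remaining for everything else: {remaining_gb:.1f} GB")
```

Even in this worst case, a student would retain 6GB of the allocation for downloading question papers, uploading answers, and other study activity.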

Hypertext: How is the private data – again complete audio and video recordings – of students going to be handled? Has Unisa made sure that this process is compliant with the Protection of Personal Information Act (POPI Act)?

Unisa: The data is securely protected and IRIS does not sell or use the data. IRIS Information Security has been reviewed and verified by the Unisa POPIA office as compliant and is in good standing with regard to the Act. Their information security systems are also internationally recognised.

Hypertext: IRIS only works on computers, not phones or tablets. Does this not exclude certain students?

Unisa: The online invigilation does require a computer and will not run well on phones or tablets. Such technology has not been expanded to cell phones or tablets yet although there are developments to support these devices in the future. Students who do not have appropriate devices for IRIS are given another examination opportunity by the institution.

Hypertext: IRIS proudly claims that it uses facial and eye tracking combined with machine learning to “detect academic dishonesty”. Unisa's use of this software could be seen as an endorsement that the technology works as intended. What proof or research can Unisa provide to justify its use?

Unisa: Academics at Unisa have been given the opportunity to test the technology to verify that it does indeed detect behaviour that is usually related to academic dishonesty. The institution does not use the collected data blindly; academics review the recordings for verification purposes. This means that there is still human intervention.

Hypertext: Carrying on from the previous question: can Unisa or IRIS claim that the machine learning shows no bias? Bias in machine learning has been the subject of much recent debate, especially around race.

Unisa: The possibilities of machine learning might be debated by scholars; however, it should be noted that Unisa academics do not use the information blindly. Academics still have the opportunity to review the data and confirm the generated reports. Such an approach eliminates the possibility of bias.