Creating fraud-resistant exams

Fraud prevention measures should not hinder student performance during the assessment, should respect the privacy of students, and should comply with the other eight requirements for remote assessment. A summary of the guidelines that your assessment should follow can be found on the Remote Assessment page.

Use the scheme and tables below to decide which fraud prevention and detection measures you need for your assessment situation.

Part 1: Recommended fraud prevention measures during the assessment per assessment type
Part 2: All fraud prevention measures during the assessment: pros and cons
Part 3: Fraud prevention measures during the course
Part 4: Fraud detection during grading
Part 5: Implementation of fraud prevention measures

1. Recommended fraud prevention measures per assessment type

For details on the pros and cons of each fraud prevention type, see part 2; for implementation, follow the links in the table or see part 5.

Fraud prevention measures are divided into four categories. The five assessment types are divided into open-book and closed-book assessments. The four categories of measures are:

  1. Question construction measures to make sure that students cannot Google the answer, or copy-paste it from someone else.
  2. Giving each student a unique exam so they cannot share answers.
  3. Authenticity/identity measures to ensure that the student who wrote the answers is the student who receives the grade.
  4. Measures that influence the course of the assessment.

Exam types:

  • Open book (no declarative knowledge): projects / larger assignments; remote exams (using an assignment tool); remote exams (using a digital exam tool)
  • Closed book (declarative knowledge): oral exams; online proctored exams

A. Question construction

  • Open-ended questions (strongly recommended)
  • No factual questions (strongly recommended)
  • No recycling of existing questions (required)
  • No standard questions that can be googled (required)

B. Unique exam for each student

  • Parametrization of numerical assignments / questions (required for numerical answers)
  • Different versions per question (random questions from a pool): small variations, or different but equivalent questions in difficulty and topic (advised for open-ended questions, required for all closed-ended questions, e.g. multiple choice)
  • Different exam versions (e.g. exam version A and B; recommended for exams in Brightspace Assignments)
  • Variation in cases and datasets (recommended for case studies, computational assignments, etc.)

C. Authenticity / identity

  • Oral authenticity checks immediately after the exam (recommended for written open-book exams)
  • Identity check before an oral exam (required)
  • Login with netID (required; for oral exams: applicable in case of MS Teams)

D. Course of assessment

  • Honor’s pledge (recommended)
  • Random order of questions (make sure the order still makes sense)
  • Limited time slots (minimum of 30 minutes per slot)

2. Fraud prevention during the assessment: pros and cons

Use the tables below to decide which fraud prevention measures are recommended or required for your type of questions. The measures are divided into four categories. For each measure, the effectiveness and the stress it induces during the exam are indicated in the right-hand columns; these should be balanced against each other.

  1. Question construction measures to make sure that students cannot Google the answer, or copy-paste it from someone else
  2. Giving each student a unique exam so they cannot share answers
  3. Authenticity/identity measures to ensure that the student who wrote the answers is the student who receives the grade
  4. Measures that influence the course of the assessment

A more detailed description of the various fraud prevention measures can be found at the bottom of this page (part 5).

| Measure | Description | Required / advised | Reduces risk of … (effectiveness) | Stress inducement during exam (– = a lot) |
|---|---|---|---|---|
| Open-ended questions / assignments only | No multiple-choice questions, etc. | Strongly advised | Sharing answers undetectably (+) | 0 (if prepared) |
| No factual questions | | Required for all open-book assessments | Copy-pasting from the book (++) | 0 |
| No recycling of existing questions | Reformulate previous questions completely and use different parameter values, names and characteristics | Required in all cases | Googling the answer, or copy-pasting from last year (++) | 0 |
| No standard questions that can be googled | | Required in all cases | Googling the answer (++) | 0 |

 

| Measure | Description | Required / advised in case of | Reduces risk of … (effectiveness) | Stress inducement during exam (– = a lot) |
|---|---|---|---|---|
| Parametrizing numerical assignments / questions | For numerical questions, give students different variable values within a physically relevant range. | Required for numerical questions; Möbius and Ans Delft accommodate this | Sharing answers (++) | 0 |
| Different version per question: small variations of the question | To make it more difficult for students to share answers (specifically answers to knowledge questions), give each student different exam questions, chosen from a question pool with equivalent questions. | Advised for open questions, required for closed-ended questions (MCQ, true/false, etc.) | Sharing answers, cooperating (+) | 0 |
| Different version per question: different questions that are equivalent in difficulty and topic | As above. | Advised for open questions, required for closed-ended questions (MCQ, true/false, etc.) | Sharing answers, cooperating (++) | 0 |
| Different versions on exam level (exam versions) | Make sure not to show version names. | Advised as a low-tech alternative to the previous two measures | Sharing answers, cooperating (+) | 0 |
| Different cases or datasets | The questions can be the same, but not the answers to the assignment / questions. | Advised for case studies, computational assignments, etc. | Sharing answers (+) | 0 |

 

| Measure | Description | Required / advised in case of | Reduces risk of … (effectiveness) | Stress inducement during exam (– = a lot) |
|---|---|---|---|---|
| Oral authenticity checks (after the exam) | Video calls in which the examiner assesses whether a randomly selected student can explain why/how they produced their solution | Advised | Not being able to explain one’s own answers at the end of the exam (+) | 0 |
| Identity check before oral exam | May not be recorded! | Required for all oral exams | Someone else taking the oral exam (++) | 0 |
| Login with netID | | Required; advised for oral exams | | |
| Online proctoring (only as a last resort) | Official surveillance via camera, screen capture, key logging and microphone, by employees of a US-based company | Advised for closed questions, if no alternative is possible and closed-book is really necessary (see decision tree). Use of Möbius required. Permission of the Board of Examiners is required. | Communication, use of resources (+; not watertight, additional measures needed) | |
| ‘Poor man’s proctoring’ (FORBIDDEN) | Asking students to turn on their camera during the exam | Never! This is NOT allowed! Online proctoring is the only legally permitted video surveillance. | Communication, use of resources (0; easy to bypass) | |
| Asking students to upload a picture, selfie, their campus ID, etc. (FORBIDDEN) | During, before or after the assessment | Never! This is NOT allowed! | Identity fraud (0): in case of identity fraud, the other student could leave the room temporarily. | 0/– (if during exams, students are disturbed) |
| Measure | Description | Required / advised in case of | Reduces risk of … (effectiveness) | Stress inducement during exam (– = a lot) |
|---|---|---|---|---|
| Honor’s pledge | Students explicitly promise not to accept or give help. | Advised | Fraud in general (0/+); makes students aware of what is not allowed. | 0 |
| Random order of questions | Randomize the order of the questions. | Only if the order of questions still makes sense and has no illogical jumps | Sharing answers, cooperating (0/+): students might still communicate the answer to the first question | 0 (– if the question order is illogical) |
| Limited time slots | The exam is split into partial exams; timeslots should preferably last 1 hour or longer, with a maximum of 4 timeslots per exam. | Advised for larger groups in case of risk of cooperation | Sharing answers, cooperating (+) | -/– (the shorter the timeslot, the worse) |
| Preventing students from reattempting previous questions | Students have to answer a specific subquestion before they can continue. If they do not know the answer, they either wait or fill in an incorrect answer. | Never | Sharing answers, cooperating (+) | (you don’t test whether students mastered the LOs) |
| Requiring students to answer multiple times if incorrect after the first try | Compensates for the lack of partial points in automatic grading. Students have to answer a specific (sub)question before they can continue; if they do not know the answer, they either wait or fill in incorrect answers repeatedly. | Never | None (0) | (students only receive negative feedback and cannot skip the question) |
| Asking too many questions | Asking more questions than time allows, so that students do not have time to cheat | Never | Having time to cooperate (?; they could still divide the questions among themselves) | |

3. Fraud prevention measures during the course

  1. Make sure that students are prepared for the exam, so they do not feel the need to commit fraud. Have them practice with representative questions/assignments and give them feedback (e.g. model answers).
  2. Inform them of what fraud is, what the consequences are, and which measures you have taken. More information for students on fraud can be found here.

4. Fraud detection during grading: What do you need to do to detect fraud?

Use plagiarism scan software to detect similarities between students’ work. In some faculties, the Board of Examiners has made this mandatory in the Rules & Guidelines. Similarities do not automatically imply fraudulent behaviour, so manual inspection is still needed (see below).

The software does not indicate who copied from whom: it only flags similarity in the work that was submitted after the first student’s submission, so both students will be under suspicion of fraud. Studying the similarity report will help you determine whether the similarity warrants a suspicion of fraud, in which case you report your suspicion to the Board of Examiners, including the similarity report.

Look for the following patterns:

  • Large similarities in answers to open questions, for example similar phrasing, found by hand or with a plagiarism checker. The formulation of answers to (longer) open questions cannot be exactly the same for multiple students.
  • Similar, strange mistakes in students’ answers. It is very unlikely that students make the exact same, uncommon mistake. In case of multiple-choice questions, statistical tools can be used to calculate the probability that similar wrong answers are purely coincidental (contact the educational advisor in your faculty, or the TU Delft learning developers of TLS via Teaching & Learning Support for more information). If you use Ans, you can keep track of strange errors during grading.
  • Logical explanation of similarities: check whether the similarities can be explained by the use of the same resource (paper, website, etc.), in which case it may be a case of plagiarism, or no fraud at all if the resource was allowed and cited correctly.
  • Use of parameters from another version of the exam, for example by students who were working together and used each other’s parameter values.
  • Time pattern. Especially in case of Möbius or Ans Delft exams, check in the log file whether groups of students submitted answers to several questions around the same time for a consistent period of time; if so, check whether the answers are similar (see the sketch after this list).
  • Free riding in group work. This will in general simply lead to a fail grade, but could be considered fraud. It may show up when students score extremely low on an individual assessment in the course.
  • Handwritten exams. If available, do a random check in which you compare the handwriting with a previous assessment. Furthermore, check for identical handwriting, especially in the case of other suspicions.
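
To illustrate the time-pattern check, here is a minimal sketch in Python. It assumes a hypothetical CSV export with one row per answer submission (columns student_id, question_id, submitted_at in ISO 8601); check the actual export format of Möbius or Ans before using anything like this.

```python
# Sketch: flag pairs of students with suspiciously synchronized submissions.
import csv
from collections import Counter
from datetime import datetime
from itertools import combinations

WINDOW_SECONDS = 120   # "around the same time": tune to your exam
MIN_SHARED = 5         # flag pairs synchronized on at least this many questions

def load(path):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield (row["student_id"], row["question_id"],
                   datetime.fromisoformat(row["submitted_at"]))

def synchronized_pairs(path):
    by_question = {}
    for student, question, t in load(path):
        by_question.setdefault(question, []).append((student, t))
    hits = Counter()
    for submissions in by_question.values():
        for (s1, t1), (s2, t2) in combinations(submissions, 2):
            if s1 != s2 and abs((t1 - t2).total_seconds()) <= WINDOW_SECONDS:
                hits[tuple(sorted((s1, s2)))] += 1
    return {pair: n for pair, n in hits.items() if n >= MIN_SHARED}

# Pairs returned here are a starting point for a manual check of answer
# similarity, not evidence of fraud by themselves.
```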

Be as unbiased as possible and make sure that random checks are really random, and not based, for example, on your experience with a student in class. For example, use a random number generator to pick the students to check.
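
A minimal sketch of such an unbiased random draw, using only Python’s standard library; the student IDs are hypothetical, and the 10% sample size follows the recommendation for oral authenticity checks later on this page.

```python
# Sketch: select students for a random check without human bias.
import random

def pick_for_check(students, fraction=0.10):
    k = max(1, round(len(students) * fraction))
    # random.sample draws without replacement, so no student is picked twice
    return random.sample(students, k)

students = ["s4821", "s1937", "s2210", "s8754"]  # hypothetical student IDs
print(pick_for_check(students, fraction=0.5))
```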

If you suspect fraud with your assessment, you have to email your fraud suspicion report to the Board of Examiners (BoE, Dutch: examencommissie) that covers your course (according to the study guide). If you suspect fraud after the assessment, it depends on the faculty whether your BoE will ask you to inform your student of the suspected fraud, or whether your BoE will take care of that. The BoE may have a checklist of all information that needs to be included in the report that you send them. More information on the procedure can be found in article 7a, section 4 of the Rules and Guidelines of your BoE.

5. Implementation of fraud prevention measures

In order to give students grades that reliably represent how well they master the learning objectives (LOs) of a course, we must:

  1. Assess these learning objectives (and nothing more).
  2. Make sure that students deliver work that reflects their own level of LO mastery. In other words, we must prevent and detect fraud, and make sure that students can perform optimally during the exam.

Exam construction measures

There are four ways to prevent fraud when developing the test:

Transform your exam into an open-book exam instead of a closed-book exam. Questions on facts (declarative knowledge) are easy for students to look up during a remote exam, and it is difficult to prevent students from using their books or ‘cheat sheets’. Therefore, do not ask for factual knowledge, but aim for questions at higher levels of Bloom (Bloom, 1956; click here for the TU Delft adaptation for engineering education), which students can only answer if they master the learning objectives. The questions should be constructively aligned with the learning activities and learning objectives.

Authorize students to use all available help and provide them with a list of sources that they are suggested to have at hand.

In case a learning objective requires reproduction of factual knowledge, consider whether this factual knowledge is crucial (for example in their professional lives) or not. In case a learning objective cannot be tested, discuss with the Board of Examiners and Programme Director whether this is acceptable.

In case you need to ask factual questions and an open book exam is not a possibility, you could consider oral examinations (depending on the number of students), or online proctored exams (only an option if online proctoring is allowed by your Board of Examiners).

Change the exam into an open-book exam with open-ended questions (i.e. no multiple choice, multiple select, true/false, etc.). This implies that you cannot ask remember-level questions. This should be constructively aligned with the learning activities and learning objectives.
See here for more information on how to construct open-ended questions

Answers to open-ended questions, especially those that require longer answers at higher levels of Bloom (click here for the TU Delft adaptation of Bloom’s taxonomy for engineering education), are not straightforward to share amongst peers. Furthermore, similarities in answers to open questions can be used to detect fraud.

If you want to know why closed-ended questions (multiple choice, yes/no, true/false, multiple select, etc.) are sensitive to fraud, and whether there are exceptions, read the text below.

Unless you are using a question bank (see below), which gives each student a unique exam, you are strongly discouraged from using multiple-choice questions (MCQs). The same holds for true/false questions and other closed-ended questions.

Why shouldn’t I use (proctored) multiple-choice questions?
Multiple-choice questions are more sensitive to fraud, since it is relatively simple to communicate the answers to other students.

Why shouldn’t I use (proctored) true/false questions?
True/false questions are not good from an educational point of view, since students will start looking for an error in any statement that they think is ‘true’. Students tend to overthink true statements in particular and will keep suspecting that they have overlooked a detail that makes the statement false. This uncertainty may diminish their performance.

What if I randomize the order of the answer options?
Changing the order of the options (answers) does not help, since students can still communicate ‘the answer that begins with/ends with/is the shortest/…’.

What if I ask them the same question at the same time?
Students can still communicate the answers.

What if I don’t allow them to go back to the previous question?
In this case, students will perform worse than on a standard exam: they either waste too much time on a question, or skip it and feel bad about the lost point, and if they remember the answer later, they will be terribly frustrated that they cannot go back. The grade will be lower and will not reflect how well they master the learning objectives.

What if I randomize the order of the questions?
Changing the order of the questions will create illogical question orders for some students, and logical question orders for others. You should at least keep related questions together and if there is a logical order within a subject/learning objective, keep that order intact. Students can still communicate answers.

What if I ask each student a unique set of questions from a database of questions?
That is possible, but it is a lot of work to create that database (‘question bank’), and it must contain good-quality questions that have proven to discriminate between good-performing and not-so-good-performing students (the p-value and Rir-value should be known). This is normally done by analyzing (test result analysis) how well these or similar questions performed on previous, regular exams. The reason that you need ‘proven-quality’ questions is that, with each student seeing different questions, you cannot rely on a test result analysis afterwards to change the scoring for some questions (i.e. giving all students full points because there was something wrong with the question).
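
For illustration, a minimal sketch of such a test result analysis: the p-value of an item is the proportion of students who answered it correctly, and the Rir-value is the correlation between the item score and the rest score (the total score minus that item). The 0/1 score matrix below is toy data.

```python
# Sketch: per-item p-value (proportion correct) and Rir (item-rest correlation).
from statistics import mean, stdev

def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (len(xs) - 1) * stdev(xs) * stdev(ys)
    return num / den if den else 0.0

def item_analysis(scores):
    results = []
    for i in range(len(scores[0])):
        item = [row[i] for row in scores]               # 0/1 score per student
        rest = [sum(row) - row[i] for row in scores]    # total without this item
        results.append({"p_value": mean(item), "rir": pearson(item, rest)})
    return results

scores = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 1]]   # toy data: rows = students
for i, r in enumerate(item_analysis(scores), 1):
    print(f"Q{i}: p = {r['p_value']:.2f}, Rir = {r['rir']:.2f}")
```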

What about multi-select questions? (These are multiple-choice questions where you can select each individual option and often don’t indicate how many options should be selected)
It is really difficult to develop good multi-select questions, and they can only be used in some cases. See below for more information on when you can use them and how.

Multi-select questions

These questions in effect consist of a number of true/false questions and are prone to cheating. True/false questions are rather difficult to develop without giving students the feeling that they might have overlooked something, and that somewhere in the statement you hid a clue that makes the statement incorrect after all.

Furthermore, if you are using Möbius, the grading is not transparent: incorrectly selecting an option is penalized more heavily than failing to select a correct option. This is not clear to lecturers, nor communicated to students. Therefore, scores are relatively low. Our experience is that scores for multi-select questions (usually) do not correlate with the grade of the student.
The reason is that it is very difficult to write good-quality multi-select questions. As with wordy questions in general, the main point is to ask a single question that can be answered without looking at the options.

When to use multi-select: The only type of question which is suitable for multi-select, is a question like ‘Which of the three geometric structures below is/are topologically identical to this structure?’ (insert a picture of a structure, and then 3 pictures of structures that are undeniably similar or dissimilar). In this case, each option should be indisputably correct or incorrect and students will not have the feeling that you are trying to trick them.

Be transparent about the scores: Be transparent in how students can earn points, and figure out how Möbius or Brightspace assigns/deducts points from the score for each incorrectly chosen or omitted option.

In case of multiple choice with 3 alternatives (3 is the preferred number of alternatives for multiple choice questions), each exam should consist of ~54 questions in order for the grade to be reliable. This number of questions is considerably higher than for open questions.

What about guessing? Do I adjust the grade?
Students can earn points by randomly guessing the correct answer, so you will need to take guessing into account when calculating the grade from the score. Be transparent about this to your students and communicate it before, during (cover page) and after the test. More information can be found in the reader of UTQ module ASSESS.
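
As an illustration, a minimal sketch of a common guessing correction: subtract the score expected from pure guessing and rescale what remains. The linear mapping onto a 1-10 grade scale is an assumption; follow the policy in the UTQ ASSESS reader for your actual exams.

```python
# Sketch of a guessing correction for multiple-choice scores.
def corrected_grade(score, n_questions, n_options, points_per_q=1):
    max_score = n_questions * points_per_q
    expected_by_guessing = max_score / n_options   # pure random guessing
    fraction = (score - expected_by_guessing) / (max_score - expected_by_guessing)
    fraction = max(0.0, fraction)                  # never below the bottom of the scale
    return 1 + 9 * fraction                        # assumed 1-10 grade scale

# 54 questions with 3 alternatives: guessing alone yields ~18 points, grade 1.0
print(corrected_grade(18, 54, 3))   # 1.0
print(corrected_grade(54, 54, 3))   # 10.0
```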

How do I analyze the results to check for problems with questions/answers? How do I adjust the grading if test result analysis shows bad results?
Do a test result analysis to assess the quality of the individual questions and use the information to change the scoring of individual questions. More information can be found in the reader of UTQ module ASSESS. In Brightspace quizzes and Möbius, most relevant information is available in the test statistics.

In order to make it more difficult for students to share answers to questions (specifically answers to knowledge questions), you can give each student different exam questions, chosen from a question pool with interchangeable questions. This can be done by using different Exam versions or creating Unique individual exams from question banks:

  1. Exam versions: Divide the students into groups and give each group a different version of the exam. The exam questions are different for each group, and the same within a group.
    • Pro: easy to set up.
    • Con: if students find out in which group they are, they can communicate within the group.
    • Work-around: if the exam is divided into parts, you can change the grouping for the second part compared to the first grouping.
      You can keep the first questions in each exam part the same, so it is harder for students to find out which group they belong to.
  2. Question pools: Give each student a unique exam, by drawing interchangeable questions from question pools. Each question pool contains questions that are interchangeable in terms of learning objective/topic and difficulty.
    • Pro: unpredictable questions, easy to set up in Brightspace quizzes
    • Con: test result analysis only usable if you have large numbers of students

In both cases, you need interchangeable questions, which will take you more time to develop than in case of a traditional exam.

What does an exam with question pools look like?
Per learning objective or topic, you will formulate a number of questions at the same levels of difficulty and of the same question type. This pool of interchangeable questions is called a question pool.
Examples of interchangeable questions in the same question pool:

  1. Fill in the blanks, automatically graded using regular expressions: Naming parts of a machine (if the answer can be copied from a book, this is only possible for proctored exams). The machine is different for each question.
  2. Short answer, automatically graded using regular expressions: Writing out the applicable formula for a situation shown in a figure. The situation is different for each question.
  3. Arithmetic question, automatically graded (all or nothing): Calculate the force on a beam in a construction. The construction or the beam is different for each question.
For each student, a unique exam is formed by randomly drawing questions from the question pools, as shown in the sketch below.
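
A minimal sketch of this assembly step, with hypothetical pool names and question IDs; tools such as Brightspace quizzes perform this drawing for you.

```python
# Sketch: assemble a unique exam per student by drawing one question per pool.
import random

pools = {
    "LO1_low_matching":   ["q1a", "q1b", "q1c"],   # hypothetical question IDs
    "LO1_low_short_open": ["q2a", "q2b", "q2c"],
    "LO1_med_arithmetic": ["q3a", "q3b", "q3c"],
}

def assemble_exam(student_id, pools):
    # Seeding with the student ID makes the draw reproducible, so the
    # same student always gets the same exam (e.g. on reconnecting).
    rng = random.Random(student_id)
    return {pool: rng.choice(questions) for pool, questions in pools.items()}

print(assemble_exam("s4821", pools))
```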

Example 1

  • LO1: 3 question pools of unrelated questions
    • Pool 1a: low difficulty: matching question
    • Pool 1b: low difficulty: short open question
    • Pool 1c: medium difficulty: arithmetic question
  • LO2: 3 question pools of unrelated questions
    • Pool 2a: low difficulty: open question, giving an explanation
    • Pool 2b: medium difficulty: arithmetic question
    • Pool 2c: medium difficulty: open question, analyzing a problem
  • LO3: 2 question pools of unrelated questions
    • Pool 3a: medium difficulty: arithmetic question, analyzing
    • Pool 3b: difficult question: analyzing data and drawing a conclusion

Example 2 (solving engineering problems)

  • LO1: 1 question pool that comprises questions with 5 subquestions each. The subquestions are of increasing difficulty.
  • LO2: 1 question pool that comprises questions with 4 subquestions each. The subquestions are of increasing difficulty.

Can I change the order of the questions?
The order of the exam questions should be logical, in order to enable students to perform optimally. Keep questions on the same topic/learning objective together.

Ask students the same numerical questions but with different numbers, so that they cannot exchange answers.

Parametrization is used in numerical questions. For each question, each student gets different numbers, chosen from a range that you determine. The outcomes therefore differ, so it is not possible to commit fraud by simply sharing answers; to help each other, students would have to share the calculation steps, which is more cumbersome. Parametrization is possible in Brightspace quizzes (arithmetic question types), in Möbius, and in Grasple (available for math service education only).
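
To illustrate the principle, a minimal sketch with hypothetical parameter names, ranges and formula; in practice Möbius, Ans or Brightspace generates the values for you.

```python
# Sketch of parametrization: each student gets different variable values
# within a physically relevant range that you determine.
import random

def parameters_for(student_id):
    rng = random.Random(student_id)     # reproducible per student
    mass_kg = rng.uniform(2.0, 10.0)    # hypothetical relevant range
    length_m = rng.uniform(0.5, 3.0)
    return round(mass_kg, 1), round(length_m, 1)

def expected_answer(mass_kg, length_m):
    g = 9.81                            # gravitational acceleration
    return mass_kg * g * length_m       # hypothetical moment in N·m

m, l = parameters_for("s4821")
print(f"m = {m} kg, l = {l} m, expected answer = {expected_answer(m, l):.1f} N·m")
```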

If you want to use parametrization in Brightspace Assignments, you could determine the values students should work with based on a digit of their student number. You can change the digit that you base the values on for each (larger) assignment, to prevent grouping. Example: use the 3rd digit of the student number in question 1, the last digit in question 2 and the second digit in question 3.

| 3rd digit of your student number | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 |
|---|---|---|---|---|---|---|---|---|---|---|
| Value for x | 1 | 3 | 2 | 4 | 3 | 4 | 2 | 1 | 2 | 3 |
| Value for y | 6 | 6 | 4 | 6 | 4 | 3 | 3 | 5 | 4 | 5 |
Be aware that students might unintentionally use incorrect values. Try to make sure that the values lead to equally difficult calculations, and have students practice with this system beforehand, to reduce the stress of seeing it for the first time during the exam.
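
A minimal sketch of the digit-based lookup, using the example table above; the student number is hypothetical.

```python
# Sketch: the digit-based workaround for Brightspace Assignments. A digit
# of the student number selects the (x, y) values, per the table above.
X_BY_DIGIT = [1, 3, 2, 4, 3, 4, 2, 1, 2, 3]
Y_BY_DIGIT = [6, 6, 4, 6, 4, 3, 3, 5, 4, 5]

def values_for(student_number, question):
    # Question 1 uses the 3rd digit, question 2 the last, question 3 the 2nd
    digit_index = {1: 2, 2: -1, 3: 1}[question]
    d = int(str(student_number)[digit_index])
    return X_BY_DIGIT[d], Y_BY_DIGIT[d]

print(values_for("4912345", 1))  # 3rd digit is 1, so x = 3, y = 6
```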

Procedural measures

Next to fraud prevention measures in the development of exam questions, you can take the following procedural measures:

Make students promise that they will only hand in their own work and that they will not use unauthorized help or tools, nor help other students. The promise can be made by having students type out the honor’s pledge, or by reading it aloud at the start of an oral exam. In case of a written exam, this can be done a day before the exam.

We trust the integrity of the student. During your course, ask students to read the TU Delft code of conduct and discuss that you expect them to adhere to it. Indicate that you will ask them to make an honor pledge and what it will say. Inform them whether they will be asked to make the pledge before, at the start of, and/or at the end of their assessment. Students can either copy or vocalize the honor pledge.

You can change the pledge to make it more applicable for your assessment. Here are two examples:

  1. Online exam:
    “I promise that I will not use unauthorized help from people or other sources during my exam. I will create the answers on my own and I will create them only during the allocated exam time slots. I will not provide help to other students during their exam.”
  2. Timed take-home exam:
    “I promise that I have not used unauthorized help from people or other sources for completing my exam. I created the submitted answers all by myself during the time slot that was allocated for that specific exam part. I will not provide nor have I provided help to other students during their exam.”

For oral exams, students can promise that they will not receive questions from students who took the exam earlier, nor provide questions to the students who will take the exam later.

For written remote exams, the honor’s pledge could also be administered one day before the actual examination, for example by having students type the text of the pledge into a Brightspace Quiz short-answer question and grading it automatically (students can have another go if they make a spelling mistake).
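
A minimal sketch of such an automatic check, tolerating small typos so that a retry is rarely needed; the 0.95 similarity threshold is an assumption to tune.

```python
# Sketch: grade a typed honor pledge automatically, ignoring case,
# extra whitespace and minor typos.
from difflib import SequenceMatcher

PLEDGE = ("I promise that I will not use unauthorized help from people or "
          "other sources during my exam.")

def normalize(text):
    return " ".join(text.lower().split())  # ignore case and extra whitespace

def pledge_accepted(submission, threshold=0.95):
    ratio = SequenceMatcher(None, normalize(PLEDGE), normalize(submission)).ratio()
    return ratio >= threshold

print(pledge_accepted("I promise that I will not use unauthorised help "
                      "from people or other sources during my exam."))  # True
```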

For oral exams, you can do it at the start of the recording (if applicable).

Complementary oral check for randomly sampled students

Contact a random 10% (or more) of the students immediately after the exam and ask them to explain a couple of their answers, to confirm that they authored the work. Make sure that you choose these students completely at random, or by using selection criteria that are clearly unbiased towards specific groups of students. Let the students know in which timeslot the oral checks take place, to prevent unnecessary waiting. Furthermore, take the Security and Privacy of Oral Examinations into account during oral authenticity checks.

In case of group projects or group assignments, let the group describe who contributed what, for example in the report. Provide them with a tool (Buddycheck) to stimulate them to give each other intermediate peer feedback on contribution, especially in larger projects. Make the group small enough so that everybody can contribute and that each contribution will be valued by their peers.

Most examiners will use the complementary post-exam oral check as an anti-fraud measure. It is important to mention that this is not a grade-determining part of the examination; it is only applied to check whether the student has been honest in submitting their work. Only a sample of the students (e.g. 5-20%) will be selected to do the online complementary oral check.

In case you do a complementary oral check on a sample of your student population, please consider the following:

  • We recommend doing the check shortly after the exam has finished and before you have graded the exams.
  • Preferably pick the students completely at random (so not the first 20% in alphabetical order, but, for example, based on a randomly picked last digit of their student number).
  • If you decide to use an algorithm to select students, make the selection criteria explicit to prevent bias.

What to ask:

  • Ask for an explanation of some of their answers.

The checks can be recorded but should only be stored if there is a suspicion of fraud. The recording and storing should be done in a similar way as described in Security and Privacy Guidelines for Oral Exams.

In case you come across irregular results while you are scoring the assignment/exam and suspect fraud, please follow the regular processes and report this to the Board of Examiners and the involved student(s).

  1. Informing students
  2. Timing
    1. Preparing for handwritten exam questions takes about 5 minutes per question due to readability issues.
    2. Doing an oral check takes about 10 minutes, unless you run into bad cases.
  3. Identity check
    1. Inform students to keep their student ID ready.
    2. Check the students’ identity before you start the oral check or oral exam.
    3. Do not record campus cards or other proofs of identity.
  4. Questioning
    1. Ask for an explanation of some of their answers to check whether it is plausible that their work is their own.
    2. Be clear about whether or not students are allowed to look at their answers and/or drafts during the oral check.
    3. If students are allowed to use draft paper during the exam, ask them to write on clean sheets and inform them that you may ask them to show these draft papers during the oral check.
  5. Recordings
    1. Do not forget to delete all recordings two months after grading, unless for students who filed complaints. Delete their recordings two months after the procedure has been finished.
  6. Tool and TA support
    1. Use a tool with a main room and break-out rooms, like Virtual Classroom (Bongo), or a tool with waiting rooms.
    2. Have TAs invite the (random) students, put them in the waiting room, help them with their audio and video, check their identity, and move them to your break-out room when you are available for the oral check.
    3. Goal: to diminish start-up time.

Split the exam into 2-4 consecutive parts (30-90 minutes per part). Each exam part is visible during its timeslot and needs to be finished before the end of that timeslot, which diminishes the extent to which answers can be exchanged. You could schedule breaks in between. Please note that students tend to become very stressed by the intermediate deadlines, which diminishes their performance and the reliability of their grade. Therefore, make sure that the timeslots are long enough for students to get into a flow of concentration; preferably give them an opportunity to practice with similar timeslots in a practice exam, and use long timeslots and as few of them as possible. Make sure that the length of each timeslot is realistic for students in exam conditions, and provide students who are entitled to extra time with correspondingly extended timeslots.

Tip: make the examination available only during the examination timeslot and only for the students who subscribed for the exam. If different groups have different exams, make each exam available only to the correct group. Close the exam after the timeslot (due date) plus a short extra time window (grace period, end date). Brightspace then flags all exams that were submitted late, while it remains possible for students to submit their work. This page provides more information on creating assignments for exams.

In case of Brightspace assignments that are written digitally, use the built-in plagiarism check in Brightspace (Ouriginal) and open each similarity report to check for larger matches in the student’s text.

In case of Brightspace Quizzes or Möbius, you have to download the students’ answers and look for similarities manually, using a spreadsheet programme.
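
A minimal sketch of such a similarity check, scripted in Python instead of a spreadsheet; the column names of the export are assumptions, so adapt them to your actual download.

```python
# Sketch: pairwise similarity of downloaded answers to one open question,
# as a starting point for a manual comparison.
import csv
from difflib import SequenceMatcher
from itertools import combinations

def load_answers(path, question_col="Q1"):
    with open(path, newline="") as f:
        return {row["student_id"]: row[question_col] for row in csv.DictReader(f)}

def similar_pairs(answers, threshold=0.85):
    pairs = []
    for (s1, a1), (s2, a2) in combinations(answers.items(), 2):
        ratio = SequenceMatcher(None, a1.lower(), a2.lower()).ratio()
        if ratio >= threshold:
            pairs.append((s1, s2, round(ratio, 2)))
    return sorted(pairs, key=lambda p: -p[2])

# High similarity is a reason to read both answers yourself,
# not proof of fraud.
```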

When you use, for example, Brightspace Assignments, you can manually check whether a student’s handwriting matches a previous assignment, whether they copied handwritten notes from a peer, or whether they seem to have copied text from peers. Ask your students to keep the original handwritten papers, in case of legibility issues.

This is the only form of online video surveillance that is allowed, and it should comply with the TU Delft Online Proctored Examination Regulation. Online proctoring is only available for digital knowledge exams that need to be taken as closed-book exams, in case it is not possible to change the exam into an oral examination due to student numbers.

  • Permission of the Board of Examiners: Your Board of Examiners needs to give explicit permission to use online proctoring for each exam. Online proctoring may only be used as a last resort, and the Board of Examiners assesses whether this is the case. Some Boards of Examiners have indicated that they will never give permission for online proctored examinations.
  • Online Proctored Examination Regulation: If you use online proctoring for your exam, you need to adhere to the Online Proctored Examination Regulations.
  • Online proctoring the only option for video surveillance: If you need to use video surveillance, you are only allowed to use online proctoring via Digital Exams, because it ensures that recorded data will be stored, processed and destroyed according to privacy regulations.
  • Availability: Online proctoring is in principle only available for knowledge question exams that are administered as digital exams. The reason why exams need to be digital is that the camera can only record the student’s face and not their handwriting, since students need to read the assignments from the screen. This implies that the camera faces their heads, not hands.
  • Maximum duration: The maximum duration of a proctored exam is 90 minutes. After 90 minutes, students need a toilet break, and the occurrence of technical issues increases.
  • Maximum number of students: The maximum number of students per group is increased to 150.
  • Available assessment tools with proctoring: Currently, proctoring is only available in combination with digital assessment tools Möbius and Grasple using the tool RPNow. Grasple is only available for mathematics (service education).
  • Practice test: Have all students do a practice exam a couple of days before the exam, to detect technical issues and procedural issues, and have students familiarize themselves with the tools.
  • Costs: Online proctoring is a paid service with costs ranging between 10-15 euros per student in the exam. The costs of a proctored exam are for the faculty.
  • Avoid multiple-choice questions in an online proctored remote exam; see the discussion of closed-ended questions earlier in part 5 for why, and for how to use them appropriately if you must.

For oral exams and (project) presentations, you can do an identity check using the student’s campus card (do not record this!).

For exams in Brightspace Quizzes or Assignments, students need to login with their netID. It is not allowed to share the login credentials with other people.
