Presented at the 2005 CLEAR Annual Conference, September 15-17, Phoenix, Arizona
“Sterling Security Practices”
Welcome and Introductions
Terry Ausman
ACT, Inc.
Setting the Stage for the Information Age
• Six out of every ten jobs require some postsecondary education and training
Carnevale and Desrochers, 2003
• By 2012, the number of jobs requiring advanced skills will grow at twice the rate of those requiring only basic skills
U.S. Department of Labor, 2000; Hecker, 2004
Security
Protect the Test
Protect the Environment
Be Nice
Security as a System
Development
Delivery
Data Monitoring
Defense
Speaker Contact Information
Terry Ausman
ACT, Inc.
101 ACT Drive, Iowa City, IA 52243
319-341-2523
www.act.org
“Sterling Test Security Practices - Test Development”
Mark G. Christensen, Ph.D.
National Board of Chiropractic Examiners
Facility Security
• The building is locked at all times.
• Building access is monitored by camera and staff.
• Guests are escorted by staff members.
• Camera monitors are located throughout the building.
• At night, internal doors and cabinets are locked and a security system is activated.
• A published list of invitees’ names is provided to our staff prior to each Test Committee meeting.
• All Test Committee members are issued name badges.
Printing Security
Our Printer:
• Signs an NBCE confidentiality agreement.
• Personally picks up and returns materials directly to NBCE.
• Has a vault on the printing premises.
• Brings overruns and extra pages to NBCE for security shredding at our facility.
• Prints more than one form of an exam.
Item Security
• Item writers sign confidentiality agreements.
• All items are extensively re-written internally.
• All exams have multiple authors.
• Materials are stored in a secure room with access limited to those who “need to know.”
• Internal and external computer systems are completely separate.
• Computer access to the item pool is limited.
Test Committee Members
• Are recommended by their college academic officer or state licensing board member.
• Sign confidentiality agreements.
• Are known by NBCE staff.
• Cannot be examinees for exams they selected.
• Are monitored for “teaching to the test.”
Test Committee Process
Test Committee Moderators:
• Meet with staff to discuss their assignments, responsibilities, and security procedures.
• Are subject-matter experts and are known by NBCE staff members.
• Are under contract with NBCE and have signed confidentiality agreements.
• Are present at all times to observe committee members and oversee item selection.
Test Committee Process, cont.
During the Meeting:
• All materials are carefully monitored.
• Highly confidential materials are labeled with the Test Committee member’s name prior to the committee meeting. They are collected and accounted for immediately after use.
• At night, the materials are returned to a vault.
Exam Shipping
Trunks are:
• Waterproof, air-tight, and secured with a combination lock.
• Packed in a secure area.
• Overnight, security-expressed to and from the test administrator.
• Sent with inventory shipping sheets to account for the materials received and returned.

Completed answer sheets:
• Are overnight-expressed back to NBCE from the test administrator in a separate container.
Final Thoughts
• Create a security-conscious culture.
• Develop standard, easy-to-follow procedures.
• Hire experienced, capable, responsible people.
• Establish policies that encourage loyalty and retention of staff and test administrators.
• Make security a part of every job description.
• Use confidentiality agreements.
• Review security procedures periodically.
Speaker Contact Information
Mark G. Christensen, Ph.D.
National Board of Chiropractic Examiners
901 54th Avenue, Greeley, CO 80364
Phone: 970-356-9100 Fax: 970-395-0021
www.nbce.org
“Sterling Test Security Practices - Delivery”
Mark Poole
Director of Test Security
Pearson VUE
Securing Test Delivery
• Four interlocking elements:
  – Test sites
  – Processes and systems
  – Training
  – Audit
• Might also be viewed as:
  – Infrastructure
  – Policy
  – Compliance
First: Understand the Threats
• Unauthorized access or disclosure
  – Item theft
  – Item ‘sharing’
• Compromised integrity
  – Giving or receiving assistance
  – Confederates/proxies
• Denial of access
  – Disruption of the examination process
Site Controls
• Physical site security
• Restricted/separate spaces
  – Admission/check-in
  – Testing equipment/supplies
  – Personal belongings
  – Test delivery
  – Proctors
• Surveillance capabilities
Process Controls
• Staffing
• Admissions
  – Authorization to test
  – Positive identification
  – Biometrics (in some programs)
• Proctoring
  – Continuous monitoring
  – Incident response
Site and System
• Example of an integrated physical and system design for test security:
  – Dedicated staff
  – Process flow
  – Biometrics
  – Air lock
  – Proctor station
  – Digital surveillance
  – Real-time incident response
Example

[Image: example site layout]
Training
• Policy development & review
• Documentation
• Proctor training
• Proctor assessment & certification
• Conduct agreement
• Security awareness
• Positive feedback
• Retraining
Audit/Compliance
• Analysis
  – Information gathering (incidents, surveys)
  – Data forensics
• Inspection
  – Log audits
  – Monitoring
  – Site inspections
  – Integrity shopping
  – Investigation
• Corrective action
Final Thoughts
• There are no ‘security requirements’
  – Only business requirements
  – Risk management
• Base controls on standards
  – Sarbanes-Oxley, SEC, EU DP (regulatory)
  – ISO 17799 (information security)
  – BS 7988 (computer-based assessments)
  – ITC and ATP guidelines for testing
Speaker Contact Information
Mark Poole, CISSP
Pearson VUE
5601 Green Valley Dr, Bloomington, MN 55437
Phone: (952) 681-3982 Fax: (952) 681-3975
www.pearsonvue.com
Data Forensics: Identifying Clues in Your Test Data
Cyndy Fitzgerald, Ph.D.
Caveon Test Security
Data Forensics
• What’s the need?
• What’s involved?
• Where can you start?
• Examples
• Forensic indicators
• Forensic lenses
What’s the need?
• Detect cheating
• Assess prevalence of cheating
  – Degree
  – Scope
• Determine program cheating tolerance
What’s involved?
• Definition
  – Data forensics refers to ways of analyzing data with an eye toward detecting test fraud.
• Process
  – Find the compromise (sort and sift)
  – Identify the type of compromise
  – Assess the degree of compromise
Where might you start?
• Person fit statistics
• DIF or drift statistics
• Other common statistics in the literature
• Result: ad hoc forensic methodology
  – Most current statistics are not designed to meet the need
  – No clear mechanism to assess risk
What else is needed?
• Need integrated statistical models
• Need a comprehensive detection methodology
Forensic Indicators
• Patterns of cheating
• Retake violators
• Collusion
• Patterns of piracy
Examples from real data
• Approximately 10% of tests in this program have demonstrated aberrance
• 10 exams in program
• The analysis is cut in different ways
Suspected Test Fraud Overview
2,442 test records, or 9% of the total of all combined test records, contain evidence of suspected test fraud.
[Pie chart: 28,013 total test records, 91.2% without suspected fraud; flagged categories include Suspected Cheating 5%, Suspected Collusion 4%, Retake Violations 1.17%, Suspected Piracy 0.4%, and Volatile Retakes 0.06%]
Incidents of Fraud By Test
[Bar chart: incidents of fraud per 1,000 tests for exams XYZ-1001 through XYZ-1010, broken out by Retake Violations, Volatile Retakes, Cheating, Piracy, and Collusion]
Grade X
[Frequency distribution of scale scores (231-291): 08-R examinees without collusion vs. the 08-R collusion group]
Data Forensic Lenses
• When
  – Looking at test performance change due to events and occurrences
• Where
  – Looking at different forms of test fraud at different sites
• Who
  – Looking at behavior of individuals that relates to test fraud
Summary
Data Forensics – The analyses look at:
• Tests over time
• Places
• People
Speaker Contact Information
Cyndy Fitzgerald, Ph.D.
Senior Director and Founder
Caveon Test Security
P.O. Box 680
Maple Valley, WA 98038
Cell: 425-922-3655
www.caveon.com
“Detection: Building a Case”
Paul MacDonald, Ph.D.
Assessment Strategies Inc.
Cheating Behaviour
• Definition
– When one candidate has copied from or shared responses with another candidate
– Evidenced by responses in common
Detection of Cheating
• What methods can be used?
  – Invigilation
  – Candidate reporting
  – Statistical analysis
• Computational demands of statistical analyses are now met by contemporary computers
Commercial Software
• Scrutiny! by Advanced Psychometrics, Inc.
  – Noted weakness: failing to consider information
• Integrity by Castle Rock Research
  – Uses five different procedures
Research
• First published research by Bird (1927) and Crawford (1930)
• Most influential work by Angoff (1974)
• Frary, Tideman, & Watts (1977)
• Hanson, Harris, & Brennan (1987)
• Bellezza & Bellezza (1991)
• IRT model used in Wollack (1997)
• In-house work by ETS (K-index) and ACT (AJ and AS statistics)
Statistical Models
• Empirical Models
  – Need to create a distribution of values to empirically derive probabilities
• Chance Models
  – Use a known distribution (e.g., normal)
Example
• Unit of comparison
  – Focus on pairs of candidates (e.g., with 1,500 candidates, a total of 1,124,250 pairs of candidates can be examined)
  – Potential cheaters are those pairs of candidates whose patterns of responses are too similar to happen by chance alone
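The pair count quoted above is just the number of two-candidate combinations, n(n − 1)/2. A quick check (the function name is illustrative):

```python
# Number of distinct pairs among n candidates: n choose 2 = n * (n - 1) / 2.
def pair_count(n: int) -> int:
    return n * (n - 1) // 2

print(pair_count(1500))  # 1124250, the 1,124,250 pairs cited on the slide
```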
Index B (Empirical Model)
Equation:

Z(Index B) = (Qab − Q̄) / SQ

Where:
Qab is the number of errors in common
Q̄ is the average number of errors in common
SQ is the standard deviation of errors in common

Parameters are calculated for all pairs in a stratum
Stratum: pairs with similar performance
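A minimal sketch of the Index B calculation in Python, assuming the errors-in-common count for each pair in the stratum is already available; the data and function names are illustrative, not from the presentation:

```python
from statistics import mean, pstdev

def index_b_z(q_ab: int, stratum_counts: list[int]) -> float:
    """Z-score for one pair's errors in common (q_ab), relative to the
    distribution of errors in common across all pairs in its stratum."""
    q_bar = mean(stratum_counts)   # average errors in common in the stratum
    s_q = pstdev(stratum_counts)   # standard deviation in the stratum
    return (q_ab - q_bar) / s_q

# Hypothetical stratum: errors in common for pairs of similar-scoring candidates.
stratum = [3, 4, 2, 5, 3, 4, 6, 2, 3, 4]
print(round(index_b_z(25, stratum), 2))  # 17.83: far outside the stratum's norm
```

As the slides note, the probability attached to such a Z-value must be derived empirically from a null distribution, not read off a normal table.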
Candidates
A hypothetical examination with 100 items
Candidate A: 68 correct, 32 wrong
Candidate B: 69 correct, 31 wrong
Common errors: 25
Graphical Presentation

[Histogram: frequency of errors in common (1-25) across candidate pairs]
Statistical Findings
Index B: Z(Index B) = 8.44
Probability: must be derived empirically (beyond the 0.01 and 0.001 reference values)
Calculating Type I Error Rate
• Plot obtained values from candidate pairs for whom cheating was not possible
• These pairs represent the null distribution
• Select appropriate cut-off value
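The steps above can be sketched as taking a high empirical quantile of the null Z-scores; the simulated data and the 0.001 level here are illustrative assumptions:

```python
import random

def empirical_cutoff(null_z_scores: list[float], alpha: float) -> float:
    """Cut-off such that roughly a fraction alpha of known-innocent
    (null) pairs would exceed it, i.e., the Type I error rate."""
    ordered = sorted(null_z_scores)
    # Index of the (1 - alpha) quantile, clamped to the last element.
    k = min(int((1 - alpha) * len(ordered)), len(ordered) - 1)
    return ordered[k]

# Simulated null distribution standing in for pairs who could not have cheated.
random.seed(0)
null_scores = [random.gauss(0, 1) for _ in range(10_000)]
print(empirical_cutoff(null_scores, 0.001))  # a cut-off in the far upper tail
```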
Plotting Null Data
[Histogram of obtained Z-scores (−4.0 to 9.0) for the Within Group and Parent Group null distributions]
Index g2 (Chance Model)
Equation:

Z(Index g2) = (Cab − Σ pi) / √(Σ pi (1 − pi))

Where:
Cab - number of responses in common
Σ pi - expected number of responses in common
Σ pi(1 − pi) - expected variance of responses in common

Index is calculated for each pair separately
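A minimal sketch of the g2 z-score under the chance model, assuming the per-item probabilities that two candidates match by chance have already been estimated (names and data are illustrative):

```python
from math import sqrt

def index_g2_z(c_ab: int, match_probs: list[float]) -> float:
    """Z-score for observed responses in common (c_ab) against the chance
    model: (Cab - sum(pi)) / sqrt(sum(pi * (1 - pi)))."""
    expected = sum(match_probs)                       # expected responses in common
    variance = sum(p * (1 - p) for p in match_probs)  # expected variance
    return (c_ab - expected) / sqrt(variance)

# Hypothetical 100-item exam where each item has a 0.6 chance of a match.
print(round(index_g2_z(90, [0.6] * 100), 2))  # 6.12
```

Because the chance model assumes a known distribution, this z-value can be referred directly to the standard normal.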
Candidates
A hypothetical examination with 100 items
Candidate A: 68 correct, 32 wrong
Candidate B: 69 correct, 31 wrong
Common responses: 90
Graphical Presentation
[Histogram: frequency of expected common responses (39.1 to 88.4) across candidate pairs]
Statistical Findings
Index g2: Z(Index g2) = 8.69
Probability = 0.0000000000000000018, less than 1 in a million billion (far beyond the 0.01 and 0.001 levels)
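The quoted probability is consistent with the one-tailed standard normal area beyond Z = 8.69, which can be checked with the complementary error function:

```python
from math import erfc, sqrt

def normal_tail(z: float) -> float:
    # P(Z > z) for a standard normal variable.
    return 0.5 * erfc(z / sqrt(2))

print(normal_tail(8.69))  # about 1.8e-18, matching the slide's figure
```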
Contrast Two Booklets
[Histogram: number of errors in common (0-30) for candidate pairs, AM vs. PM booklets]
Additional Information
• Seating arrangements - distances and angles
• Invigilator reports - eye witnesses
• Answer sheets - eraser patterns
• Test booklets - notes, answer strings
• Calculation sheets - matches with answers
• Opportunity for candidates to respond - confessions
Conclusion
• Evidence of test administration irregularities
• Candidates A and B obtained achievement similarities not attributable to chance
• Evidence that scores are not valid
Writing Centre Evaluation
• Examined all pairs of candidates within a writing centre
• For each candidate, determined if they were included in a ‘suspicious’ pair
• Recorded the number of detected candidates within the writing centre
• Repeat process for each writing centre
• Compare results
Writing Centres
Number of Candidates and Percent Suspicious Behaviour by Writing Centre

[Chart: number of candidates and percent suspicious (0-100%) for writing centres 1-14, AM and PM administrations, with the national average shown for comparison]
Summary
• Computational demands can be met
• Statistical detection methods available
• Flexibility to examine candidates and writing centres
• Determine the method(s) most suitable for your examinations
Speaker Contact Information
Paul MacDonald
Assessment Strategies Inc.
1400 Blair Place, Suite 210
Ottawa, ON K1J 9B8
(613) 237-0241 ext. 247
www.asinc.ca
“Sterling Test Security Practices - Defense”
Roger Meade
Director of Global Security, Thomson Prometric
Notification of Regulations
Examinee Agreement:
• Prohibited to disclose exam content by any means
• Prohibited to use unauthorized aids
• Rules of conduct during exam

Exam Supervisor/Proctor Agreement:
• No involvement in tutoring or coaching classes
• Prohibited to disclose exam content by any means
• Cannot take the test themselves
• Cannot hold or pursue the exam sponsor’s license or certification themselves
• Submit to background check
Handling Misconduct
• Administrator to have co-witness
• Clear policy for warning or exam termination
• Standard format for documenting & reporting of incident
• Collection of evidence - chain of custody:
– Document who accessed materials
– If paper, administrator to sign & date back of each sheet
– If equipment (digital camera, PDA, etc.), document make, model, serial #, delete any data recorded during exam
– Send to test sponsor or EDP via traceable method
Determine Examinee’s Status
• Disciplinary hearing
• Prepare any witnesses prior to hearing
• Have witnesses & examinee present their account of events
• Present evidence
• Present investigative data
• Obtain signed affidavits from all parties involved
Disciplinary Actions
• Forfeiture of exam & fees, invalidate results
• Prohibition of further testing for program
• Revoke current licenses or certifications
• Compliance dept. - termination
• If violation involves disclosure, reproduction, or transmission of items, pursue civil or criminal prosecution
Digital Millennium Copyright Act (DMCA)
• Passed October 28, 1998
• Supported by software & entertainment industries
• Requires ISPs to remove or block access to websites that offer infringing material
• ISPs can be held liable if they do not take action after notification
• Most ISPs have DMCA submission forms to file complaints
Additional Investigative Resources
• State-based investigators/compliance officers
• Federal Authorities - if the issue crosses state lines or involves fraudulent ID
• Security consultants- Kroll, Pinkerton, etc.
• Be vigilant for press awareness
Speaker Contact Information
Roger Meade, Director of Global Security
Thomson Prometric
1000 Lancaster St.
Baltimore, MD 21202
(443) [email protected]
Disciplinary Process: Preparation for Action

John Wickett
Assessment Strategies Inc.
• Create a sound disciplinary plan
• Create an environment that minimizes the likelihood of an event
• But, if necessary, execute the plan
Overview
• If you have discovered a suspicious event, what do you do?
• Action must be taken upon discovery, but taken calmly, coolly, and with measured steps
Process
• Discovery
• Investigate and verify
• Inform candidate
• Impose sanction
• Conduct appeal
• Court
Process
• Discovery of suspicious event
  – Types of events
    • cheating
    • item theft
    • collusion
  – Sources of discovery
    • invigilator report
    • candidate informant
    • other informant
    • statistical analysis
Process
• Investigate and verify facts
  – Remember that the individual(s) in question may actually be innocent of all wrongdoing
  – Treat all players with respect and ensure that their privacy is respected
  – All reports must be verified, ideally with physical or other supporting evidence
  – Escalate the level of investigation only as data increasingly supports the need to pursue discipline
Process
• Investigate and verify facts
  – Methods
    • Contact invigilator
    • Check seating arrangements and line of sight
    • Review video surveillance
    • Statistical analysis
    • Private investigator
    • Police action
Process
• Alert individual
  – If the case is questionable, obtain a legal opinion
  – Inform the candidate, in writing, of the Regulator’s understanding of what happened and that an infraction appears to have been committed, and provide an opportunity for the candidate to respond with an alternative explanation
  – Results are held until resolution
Process
• Inform of sanction and appeal method
  – If a response from the candidate is not forthcoming or not sufficient, inform them of the sanction to be taken
  – They should receive a copy of the appeal process that they can choose to follow
Process
• Inform of sanction and appeal method
  – Sanction options?
    • Seek legal advice
    • Not a pass
    • Barred for life may be too extreme
    • Simply waiting for the next administration may be too lenient
    • Having to wait some time period before a subsequent attempt is probably reasonable
Process
• Internal Appeal Process
  – Used if the candidate chooses to contest the sanction
  – Basic approach is to set up a Board of Appeals, ideally with one or more external members
  – Need direct legal counsel at this point
  – Board solicits submissions from the Regulator and the Candidate and may or may not hold a hearing
  – Board’s decision is final
Process
• Court
  – Used if the candidate rejects the sanction approved by the Board of Appeals and chooses to pursue the case
  – New set of rules; the Regulator is no longer in charge of the situation
  – May need to take action to protect the security of the test
Enabling Factors
• Before the Event
  – Fully developed and transparent investigation and appeal process
  – Inappropriate behaviour is explicitly defined
  – Prior agreement from the candidate not to engage in inappropriate behaviour
Enabling Factors
Sample Candidate Declaration for Front of Examination Booklet
IMPORTANT NOTICE
This examination booklet and its contents, including the examination questions, are highly confidential and are the property of the [Regulatory Authority]. Candidates taking the examination are therefore prohibited from disclosing the contents of the examination booklet and must not, under any circumstances, share any of the information it contains with any person, except as authorized by the [Regulatory Authority]. Unauthorized production, reproduction or publication of the examination questions is also prohibited by copyright laws. In addition, the [Regulatory Authority] has implemented measures and statistical procedures to detect cheating (i.e., copying answers from another candidate; voluntarily or involuntarily providing answers to another candidate). Unauthorized disclosure of the contents of the examination booklet and any other form of cheating is unethical behaviour and shall result in sanctions. If the regulatory authority determines a candidate has cheated on the examination or . . .
Enabling Factors
Sample Candidate Declaration for Front of Examination Booklet (continued)
. . . has attempted to remove examination materials from the testing room, the candidate is automatically assigned a fail result and the writing is counted as an examination writing. Other sanctions may be imposed and may extend to being denied access into the profession.
CANDIDATE DECLARATION
I acknowledge that I have read the above provisions regarding the disclosure, production, reproduction or publication of the examination booklet and its content, and cheating with respect to the examination. My signature on this examination booklet constitutes my agreement not to disclose, produce, reproduce or otherwise engage in the publication of the examination booklet and its content, unless authorized by the [Regulatory Authority], or to cheat with respect to the examination.
Enabling Factors
• Before the Event
  – Release of information as to the types of analyses and investigations that occur to detect inappropriate behaviour
  – Release of information about other cases
Enabling Factors
• During the Event
  – Monitoring of situation
  – Recording of detailed notes
  – Invigilator informs candidate that the behaviour is inappropriate
Enabling Factors
• After the Event
  – Execute on the Disciplinary Process
    • Never vary from established steps
  – Collect any available eyewitness testimony, run statistical analyses
  – Obtain physical evidence
    • Photos
    • Measurements
    • Marks on question books, answer sheets, scrap paper
Cases
• FPSC
  – Classic cheating with confession in many cases and rare escalation to Board of Appeals
  – Used cases to prompt improvements to process
• MCC
  – Cheating decision upheld in court based on statistical data
Cases
• RCMP
  – Breakdown in adherence to Discipline Process led to release of test
• 2 Health Care Regulators
  – Failed action for one due to missing policy . . .
  – . . . led to changes for all other provincial regulators . . .
  – . . . which enabled successful action for another
Impact of Failure to Pursue
• This is serious; these individuals are attacking the integrity of the profession and invalidating the hard work of the dedicated professionals who built the test
• Owe it to those candidates who honestly seek and obtain their license to stop those who would obtain that same license unethically and without the required level of competence
• Regulatory authority that does not pursue may not be fulfilling its mandate
Impact of Failure to Pursue
• Risks
  – Public safety
  – Legal risks (employer, public)
  – Integrity of credential
  – Integrity of profession
Close
• Ideally, there would be no need for disciplinary action, and every effort should be made to prevent opportunities for candidates to engage in unethical testing practices
• Some will, though, and when they do, engage the Disciplinary Process
Speaker Contact Information
John Wickett
Director, Assessment Design
Assessment Strategies Inc.
Suite 210, 1400 Blair Place
Ottawa, ON
866-321-8378
www.asinc.ca
Questions