RELTA in Practice

Transcript
Page 1: Test Assessment and Implementation Rationale

RELTA in Practice

Page 2

Overview

• Test administration
• Scoring processes
  - section scores and weighting
  - rating processes
• Generation of results
• Test maintenance: reliability
• Compromises: theory vs practice

Page 3

Recap: Structure

[Diagram: test structure, showing the Listening and Speaking components]

Page 4

Recap: Delivery

Page 5

Administration requirements

• On-site delivery
• Separation of delivery and rating processes
• Quality control: score reliability, training, data collation and storage
• Minimise workload and human error
• Maximise test security

Page 6

Administration: Roles

• Administrators – test centre
• Examiners (= interlocutors) – test centre
• Raters ×2 (or 3) – anywhere (local or RMIT)
• RMIT Operations

Page 7

Administration

Two delivery models, with the stages (administration, RELTA Examining, RELTA Rating, issuing of results) split between test centre roles and RMIT roles:

• Local delivery (local administration and examining): the test centre handles administration and RELTA Examining; RMIT handles RELTA Rating and the issuing of results.
• RMIT delivery: the test centre handles local administration only; RMIT handles RELTA Examining, RELTA Rating and the issuing of results.

Page 8

Administration

Process flow: test versions dispatched → RELTA administered → speech files uploaded to server → speech files dispatched for independent rating → speech files rated (×2 raters) → Rater 1 and Rater 2 scores combined and checked → results issued to the test centre, the regulator and candidates.

Page 9

Administration: Processes

1. Register candidates

2. Administer RELTA Listening – in groups

3. Listening tests marked – database

4. Examiner delivers RELTA Speaking

5. Speaking performance recorded

6. Uploaded to RMIT server

7. Raters (1 and 2) login to database and enter scores

8. Rater scores compared for reliability; a 3rd rater is brought in where the two diverge

9. Results calculated

10. Certificates generated

Page 10

Administration: SRMS

Scoring Record Management System (SRMS):

Developed to manage administration:
• Distributes candidate speaking files to raters
• Captures rater scores
• Allows for remote and independent rating
• Identifies rater discrepancies
• Generates results
• Produces certificates
• Stores candidate details and speaking files centrally

Online access for:
• Administrators (registering candidates, accessing results)
• Raters (entering scores for RELTA Speaking)

Page 11

Scoring

           Listening (comprehension) marks   Speaking weighting
Section 1  10                                20%
Section 2  10                                35%
Section 3  10                                45%

Page 12

Scoring: Speaking

                        Pronunciation  Structure  Vocabulary  Fluency  Comprehension  Interactions
Section 1 (20%)         4              4          5           5        N/A            5
Section 2 (35%)         -              -          -           -        -              -
Section 3 (45%)         -              -          -           -        -              -
Weighted average score  -              -          -           -        -              -

OVERALL SCORES
ICAO Score (lowest overall score reported): 4

Page 13

                        Pronunciation  Structure  Vocabulary  Fluency  Comprehension  Interactions
Section 1 (20%)         4              4          5           5        N/A            5
Section 2 (35%)         4              4          4           4        N/A            4
Section 3 (45%)         -              -          -           -        -              -
Weighted average score  -              -          -           -        -              -

OVERALL SCORES
ICAO Score (lowest overall score reported): 4

Page 14

                        Pronunciation  Structure  Vocabulary  Fluency  Comprehension  Interactions
Section 1 (20%)         4              4          5           5        N/A            5
Section 2 (35%)         4              4          4           4        N/A            4
Section 3 (45%)         4              4          4           5        N/A            5
Weighted average score  -              -          -           -        -              -

OVERALL SCORES
ICAO Score (lowest overall score reported): 4

Page 15

                        Pronunciation  Structure  Vocabulary  Fluency  Comprehension  Interactions
Section 1 (20%)         4              4          5           5        N/A            5
Section 2 (35%)         4              4          4           4        N/A            4
Section 3 (45%)         4              4          4           5        N/A            5
Weighted average score  4              4          4.2         4.65     22/30          4.65

OVERALL SCORES
ICAO Score (lowest overall score reported): 4
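The weighting arithmetic behind the table above can be sketched in code. This is an illustrative sketch only, using Rater 1's scores from the slides; the function names and the truncation to a whole ICAO level are assumptions, not RELTA's published algorithm.

```python
# Section weights from the slides: 20% / 35% / 45%.
WEIGHTS = (0.20, 0.35, 0.45)

def weighted_average(section_scores):
    """Combine one rater's per-section scores (sections 1-3) for a criterion."""
    return sum(w * s for w, s in zip(WEIGHTS, section_scores))

# Rater 1's scores per criterion; comprehension is omitted because it comes
# from the Listening paper (marked out of 30), not from these section scores.
criteria = {
    "pronunciation": (4, 4, 4),
    "structure": (4, 4, 4),
    "vocabulary": (5, 4, 4),
    "fluency": (5, 4, 5),
    "interactions": (5, 4, 5),
}
averages = {c: round(weighted_average(s), 2) for c, s in criteria.items()}
# ICAO score = lowest overall score; int() truncation here is an assumption.
icao = min(int(v) for v in averages.values())
print(averages, icao)  # e.g. vocabulary -> 4.2, fluency -> 4.65, ICAO level 4
```

This reproduces the slide's weighted-average row (4, 4, 4.2, 4.65, 4.65) and the reported ICAO score of 4.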

Page 16

Scores shown as Rater 1 / Rater 2:

                        Pronunciation  Structure  Vocabulary  Fluency      Comprehension  Interactions
Section 1 (20%)         4 / 5          4 / 5      5 / 4       5 / 5        N/A            5 / 5
Section 2 (35%)         4 / 4          4 / 4      4 / 4       4 / 5        N/A            4 / 4
Section 3 (45%)         4 / 4          4 / 4      4 / 4       5 / 4        N/A            4 / 5
Weighted average score  4 / 4.2        4 / 4.2    4.2 / 4     4.65 / 4.55  22/30          4.2 / 4.65

Combined overall scores
OVERALL Score (score reported for ICAO purposes)

Page 17

Scores shown as Rater 1 / Rater 2:

                           Pronunciation  Structure  Vocabulary  Fluency      Comprehension  Interactions
Section 1 (20%)            4 / 5          4 / 5      5 / 4       5 / 5        N/A            5 / 5
Section 2 (35%)            4 / 4          4 / 4      4 / 4       4 / 5        N/A            4 / 4
Section 3 (45%)            4 / 4          4 / 4      4 / 4       5 / 4        N/A            4 / 5
Weighted average score     4 / 4.2        4 / 4.2    4.2 / 4     4.65 / 4.55  22/30          4.2 / 4.65
Combined weighted average  4.1            4.1        4.1         4.6          22/30          4.425

Combined overall scores
OVERALL Score (score reported for ICAO purposes)

Page 18

Scores shown as Rater 1 / Rater 2:

                           Pronunciation  Structure  Vocabulary  Fluency      Comprehension  Interactions
Section 1 (20%)            4 / 5          4 / 5      5 / 4       5 / 5        N/A            5 / 5
Section 2 (35%)            4 / 4          4 / 4      4 / 4       4 / 5        N/A            4 / 4
Section 3 (45%)            4 / 4          4 / 4      4 / 4       5 / 4        N/A            4 / 5
Weighted average score     4 / 4.2        4 / 4.2    4.2 / 4     4.65 / 4.55  22/30          4.2 / 4.65
Combined weighted average  4.1            4.1        4.1         4.6          22/30          4.425
Combined overall scores    4              4          4           4            5              4

OVERALL Score (score reported for ICAO purposes): 4
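A minimal sketch of how the slide's numbers suggest the two raters are combined: average the weighted averages per criterion, then report the whole ICAO level at or below the result. The truncation is inferred from the slide (a combined 4.6 for fluency is reported as level 4); it is not a documented RELTA rule. Comprehension is left out because the mapping from the 22/30 listening mark to a level is not shown.

```python
import math

# Each rater's weighted averages per criterion, as on the slide.
r1 = {"pronunciation": 4.0, "structure": 4.0, "vocabulary": 4.2,
      "fluency": 4.65, "interactions": 4.2}
r2 = {"pronunciation": 4.2, "structure": 4.2, "vocabulary": 4.0,
      "fluency": 4.55, "interactions": 4.65}

combined = {c: (r1[c] + r2[c]) / 2 for c in r1}           # e.g. fluency -> 4.6
overall = {c: math.floor(v) for c, v in combined.items()} # assumed truncation
icao_overall = min(overall.values())                      # lowest score reported
print(combined, overall, icao_overall)
```

With the slide's figures this yields combined averages of 4.1, 4.1, 4.1, 4.6 and 4.425, combined overall scores of 4 across these criteria, and an overall ICAO score of 4.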

Page 19

Scoring: Rater 1 interface

Enter scores for Sections 1, 2 and 3

Download rating scripts

Listen to candidate sound file

Page 20

Scoring: Rater 2 interface

Enter scores for Sections 1, 2 and 3

Download rating scripts

Listen to candidate sound file

Page 21

Admin interface

Calculates overall scores

Determines ICAO Level

Imports scores from R1 and R2

Imports Listening score

Identifies rater discrepancies
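A hedged sketch of a discrepancy check like the one SRMS performs: flag any criterion where the two raters' levels differ by more than a tolerance, so a third rating can be requested. The tolerance of 1 level and the function name are assumptions; the slides do not state the actual threshold.

```python
TOLERANCE = 1  # assumed policy: more than one level apart triggers a 3rd rater

def find_discrepancies(rater1, rater2, tolerance=TOLERANCE):
    """Return the criteria whose two ratings differ by more than `tolerance`."""
    return [c for c in rater1 if abs(rater1[c] - rater2[c]) > tolerance]

r1 = {"pronunciation": 4, "structure": 4, "vocabulary": 5, "fluency": 5}
r2 = {"pronunciation": 5, "structure": 4, "vocabulary": 3, "fluency": 5}
flagged = find_discrepancies(r1, r2)
print(flagged)  # vocabulary differs by 2 levels, so independent rating is needed
```

One-level differences are absorbed by averaging (as in the combined-score slides), which is why only larger gaps would need a third opinion under this assumed policy.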

Page 22

Data collation


Page 23

Rating discrepancies

Page 24

SRMS: Rating discrepancy

Page 25

Inter-rater reliability

Page 26

Inter-rater reliability

Page 27

Rater re-accreditation

Page 28

Rater accreditation

Page 29

Examiner accreditation

Page 30

Manuals

Page 31

Summary

Testing for ICAO compliance requires:

Test instruments that are well designed and can be implemented effectively