The ELPA21 Field Test Implementation Report
September 2015
ELPA21 Document Change History

Date       Version   Change Made by   Description of Change
9/23/2015  1         n/a              Original version
TABLE OF CONTENTS

I. INTRODUCTION
   A. ELPA21 OVERVIEW
   B. PURPOSE AND SCOPE OF REPORT
   C. RELATED REPORTS AND DOCUMENTS
II. FIELD TEST AND TRIAL
   A. PLATFORM AND SYSTEM TRIAL
      i. PURPOSE AND TIMING
      ii. RECRUITMENT AND PARTICIPATION
      iii. GOALS AND SCOPE
      iv. SURVEY FEEDBACK
      v. RECOMMENDATIONS
   B. FIELD TEST
      i. PURPOSE AND TIMING
      ii. ANALYSIS GOALS AND SCOPE
      iii. RECRUITMENT AND PARTICIPATION
   C. TRAINING PROCESS
      i. TRAINING MODULES
      ii. FIELD TEST ADMINISTRATION AND TECHNICAL MANUALS
      iii. INTERACTIVE DEMOS
      iv. HARDWARE AND HEADSET SPECIFICATIONS
III. FIELD TEST SURVEY
   A. PROCESS FOR DEVELOPING SURVEY QUESTIONS
   B. SURVEY PARTICIPANTS AND OUTREACH
   C. GENERAL QUALITATIVE FEEDBACK
   D. MAJOR AREAS OF FEEDBACK AND RECOMMENDATIONS FOR OPERATIONAL ADMINISTRATION
      i. TRAINING PROCESSES AND MATERIALS
      ii. TEST PLATFORM AND TECHNICAL ISSUES
      iii. ACCESSIBILITY AND ACCOMMODATIONS
      iv. COMMUNICATIONS
      v. ITEM SPECIFIC ISSUES
      vi. HELPDESK
IV. CONCLUSION
   A. REFLECTION
   B. PLANNED SUPPORT FOR YEAR 1
   C. RECOMMENDATIONS FOR STATES
V. ACKNOWLEDGEMENTS
VI. APPENDICES
   A. APPENDIX A: ADDITIONAL RESOURCES PROVIDED TO EDUCATORS, ADMINISTRATORS AND FAMILIES PRIOR TO THE ELPA21 FIELD TEST: TRAINING AND TECHNICAL SUPPORT
   B. APPENDIX B: FIELD TEST ITEM BANK INVENTORY TABLE
   C. APPENDIX C: HEADSET SPECIFICATIONS
   D. APPENDIX D: HARDWARE SPECIFICATIONS
   E. APPENDIX E: FIELD TEST PLATFORM ISSUES TRACKER
   F. APPENDIX F: FIELD TEST ITEM ISSUES TRACKER
   G. APPENDIX G: PLATFORM AND SYSTEM TRIAL SURVEY QUESTIONS
   H. APPENDIX H: FIELD TEST SURVEY QUESTIONS
I. INTRODUCTION
a) ELPA21 OVERVIEW
ELPA21 is a computer‐based assessment administered online to English language learners (ELLs) in grades K–12
to measure their emerging English language proficiency as they progress through school and work toward
achieving college and career readiness. Developed by the ELPA21 consortium, ELPA21 aligns with the English
Language Proficiency (ELP) Standards that define what English language skills ELLs should have at particular
grade levels to be successful in school.
ELPA21 assesses students’ English language proficiency levels and progress in four domains: reading, writing,
listening, and speaking. The six ELPA21 grade bands are K, 1, 2‐3, 4‐5, 6‐8, and 9‐12.
Member states of the ELPA21 consortium include Arkansas, Iowa, Kansas, Louisiana, Nebraska, Ohio, Oregon,
South Carolina, Washington, and West Virginia. These states will use ELPA21 in place of their previous English
language proficiency assessments starting in school year 2015‐2016.
The ELPA21 system includes:
● an annual summative assessment for each grade band for monitoring student progress, tracking
accountability, certifying program exit, and prompting instructional improvement.
● a screener assessment to provide information for ELL identification and placement.
b) PURPOSE AND SCOPE OF REPORT
The ELPA21 Field Test Implementation Report contains a summary of the development and implementation of
the ELPA21 Field Test, which took place in schools in eight states from February 2 through March 31, 2015. This
report documents the operational aspects of the Field Test; the technical report on Field Test item statistics will
be released separately.
This report also summarizes and responds to feedback collected during and after the Field Test from a number
of sources, including help desk calls, in‐person observations by ELPA21 personnel, and a post‐field‐test survey
that was made available to all participating educators. Prevalent trends and areas for improvement are noted,
and recommendations for implementation are offered.
ELPA21 states and their platform vendors are encouraged to read this report with specific focus on the recommendations for platform configuration and delivery specifications. These recommendations are set off in italics and marked with an icon.
c) RELATED REPORTS AND DOCUMENTS
Please refer to the following reports, which will be released later this year, for additional technical information
about the ELPA21 Field Test and the ELPA21 assessment system generally.
Report Title Anticipated Date of Release
ELPA21 Assessment Framework October 2015
ELPA21 Implementation Manual October 2015
ELPA21 Field Test Technical Report December 2015
II. FIELD TEST AND TRIAL
A. PLATFORM AND SYSTEM TRIAL
i. PURPOSE AND TIMING
The Platform and System Trial was a small‐scale event that took place Jan. 6‐14, 2015. School‐based educators
and administrators from 361 schools simulated the student testing experience, evaluated the administration
guides, and provided feedback on the testing platform. Feedback from schools and districts during the Platform
and System Trial was then used to refine the Field Test platform.
ii. RECRUITMENT AND PARTICIPATION
Participation was voluntary, and schools were invited to register in October 2014. Questar Assessment Inc.
(Questar) worked collaboratively with Reingold Inc. and ELPA21 states to communicate with schools and districts
via email, as well as via ELPA21’s public‐facing website. An effort was made to ensure that participants in the
Platform and System Trial were able to test the platform on the full array of testing equipment, including
tablets. The trial was offered to participants as a way to familiarize themselves with the testing platform and
training materials prior to the opening of the Field Test window.
iii. GOALS AND SCOPE
During the Platform and System Trial, users logged into the Questar Administration site and simulated the setup
and training process using pilot versions of the Test Administrator’s Manual (TAM), Test Coordinator’s Manual
(TCM), Accessibility and Accommodations (AA) Manual, Training Modules, and Interactive Demos. Participants
practiced various administrative functions (e.g., adding students to the system, setting up classes, printing test
tickets) to evaluate the usefulness of the training materials and accessibility of the administrative platform.
Feedback from the Platform and System Trial was then used to inform adjustments to the Field Test platform
and the operational assessment system.
When reviewing the Test Administration Site, participants were asked to complete the following steps:
● Access and read through all online manuals, as well as other support materials.
● Upload student data files.
● View and create user accounts.
● Modify student’s personal needs profile (PNP) information.
● Schedule test sessions for a class of students.
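Several of these steps, uploading student data files in particular, can be smoke-tested before touching the live administration site. The sketch below is purely illustrative: the column names and grade values are assumptions made for the example, since the actual Questar student data file layout is not reproduced in this report.

```python
import csv
import io

# Hypothetical column layout -- the real Questar upload specification
# is not part of this report; these names are assumptions for the example.
REQUIRED_COLUMNS = {"student_id", "last_name", "first_name", "grade", "school_code"}
VALID_GRADES = {"K"} | {str(g) for g in range(1, 13)}

def validate_student_file(text):
    """Return a list of error strings for a student data CSV; empty means OK."""
    errors = []
    reader = csv.DictReader(io.StringIO(text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        errors.append("missing columns: %s" % ", ".join(sorted(missing)))
        return errors
    for line_no, row in enumerate(reader, start=2):
        if not row["student_id"].strip():
            errors.append("line %d: blank student_id" % line_no)
        if row["grade"] not in VALID_GRADES:
            errors.append("line %d: unrecognized grade %r" % (line_no, row["grade"]))
    return errors

sample = "student_id,last_name,first_name,grade,school_code\n1001,Lee,Ana,K,0042\n"
print(validate_student_file(sample))  # -> []
```

A coordinator could run a check like this against an export from the local student information system before uploading, catching blank IDs or missing columns before they become helpdesk calls.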
When reviewing the Student Platform, participants were asked to perform specific user‐acceptance testing
including:
● Check that the secure browser was indeed secure by attempting to access programs and features
outside of the test.
● Ensure that the test could be paused and reopened without the browser crashing.
● Turn on accommodations to make sure they were being administered correctly.
● Navigate through the test items to test their functionality.
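The four user-acceptance checks above amount to a small pass/fail checklist. The sketch below is a generic illustration of how a participating site might record and summarize trial results; it is not part of the Questar tooling.

```python
# Hypothetical record of the four user-acceptance checks listed above.
CHECKS = [
    "secure browser blocks programs and features outside the test",
    "test can be paused and reopened without the browser crashing",
    "accommodations are administered correctly when turned on",
    "test items can be navigated and function as intended",
]

def summarize(results):
    """results maps a check description to True (pass) or False (fail)."""
    failed = [c for c in CHECKS if not results.get(c, False)]
    return "all checks passed" if not failed else "failed: " + "; ".join(failed)

print(summarize({c: True for c in CHECKS}))  # all checks passed
```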
After this review, participants completed the Platform and System Trial survey. The survey was closed on January
15, 2015.
iv. SURVEY FEEDBACK
Overall, respondents to the Platform and System Trial survey agreed that the Test Administration Manual was
easily understood (average rating 3.04 out of 4); that the instructions in the Test Coordinator’s Manual were
clear (average rating 2.92 out of 4); and that the Accessibility and Accommodations Manual provided sufficient
detail to implement the ELPA21 assessments (average rating 3.08 out of 4).
The respondents also provided clear feedback on areas for improvement related to technical problems during the testing process (55.3 percent of respondents disagreed with the statement “I experienced no technical problems during the testing process”), and this feedback was used to improve the training materials and technical support provided during the Field Test.
Of the respondents who provided general feedback, 26.7 percent commented on the testing platform,
describing problems they encountered and suggesting ways to improve it. For example:
● “There are some delays in device responses in items that contain audio. The audio does not start as
quickly as what students are used to. It may be helpful to add a spinning icon or some other picture that
indicates whether the program is 'thinking'.”
● “Overall it was a fairly good experience, however the only problem was when I wanted to pause the
assessment, when I wanted to begin again I had to begin at the beginning of the test and not where I left
off.”
Other respondents commented about specific items (28.3 percent); provided general support and information
(16.7 percent); noted that computer literacy is required to take the assessments (11.7 percent); and provided
comments about training (5.0 percent), the help tab (3.3 percent), and the short timeframe (3.3 percent).
The full list of survey questions asked after the Platform and System Trial can be found in Appendix G.
v. RECOMMENDATIONS
Based on the Platform and System Trial survey results, Questar made the following recommendations which
were applied to the support materials provided to educators prior to the Field Test.
Test Administrator’s Manual (TAM)
● Place more emphasis on students’ use of the Interactive Demos (ID) prior to testing.
● Better explain the differences between the ID and the Field Test (e.g., student responses during the ID were not saved, and the final review screen in the ID did not have the same functionality); this also applies to the operational summative assessment.
● Adjust and enhance directions within the TAM for kindergarten and grade 1 administrators.

Test Coordinator’s Manual (TCM)
● Revise the sections of the TCM that explain adding students, teachers, and classes.
● Add supplemental user account information.
● Add Training Module download information.

Training Modules
● Post slide decks from the training modules to the Help tab (in addition to the video modules already there).

General
● Encourage users to call Customer Support for specific information about issues with recording, hearing, screen freezing, etc.
B. FIELD TEST
i. PURPOSE AND TIMING
The purpose of the Field Test was to perform a large‐scale test of ELPA21 item accessibility and performance
across various student populations: English Language Learners (ELLs), prior (exited) ELLs, students who were
screened and identified as English‐proficient, and students whose only language is English. (See Recruitment and
Participation for more information about targeted student demographics.)
ii. ANALYSIS GOALS AND SCOPE
Participation in the ELPA21 Field Test was voluntary and unlimited within states that were a part of the ELPA21 consortium. A total of 14,721 students from eight ELPA21 states (Arkansas, Iowa, Kansas, Nebraska, Ohio, Oregon, Washington, and West Virginia) participated in the ELPA21 Field Test. Participation by state is indicated in the table below.
ELPA21 FIELD TEST PARTICIPATION BY STATE

State   Completed at Least One Domain   Completed All Four Domains
AR      2,012                           1,818
IA      1,179                           1,101
KS      3,151                           2,995
NE      1,696                           1,499
OH      260                             234
OR      2,824                           2,565
WA      2,856                           2,523
WV      743                             719
Total   14,721                          13,454
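As a quick consistency check on the table, the completion rate (students completing all four domains as a share of those completing at least one) can be computed per state and overall. The figures below come directly from the table:

```python
# Participation figures from the table above: (>= 1 domain, all 4 domains).
participation = {
    "AR": (2012, 1818), "IA": (1179, 1101), "KS": (3151, 2995),
    "NE": (1696, 1499), "OH": (260, 234),   "OR": (2824, 2565),
    "WA": (2856, 2523), "WV": (743, 719),
}

total_any = sum(a for a, _ in participation.values())
total_all = sum(b for _, b in participation.values())
assert (total_any, total_all) == (14721, 13454)  # matches the reported totals

for state, (any_dom, all_dom) in sorted(participation.items()):
    print("%s: %.1f%% completed all four domains" % (state, 100 * all_dom / any_dom))
print("Overall: %.1f%%" % (100 * total_all / total_any))  # Overall: 91.4%
```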
iii. RECRUITMENT AND PARTICIPATION
In September 2014, ELPA21 states were provided with communications materials to support recruitment for the
ELPA21 Field Test. These materials included email templates for districts and schools and web content for State
Education Agency (SEA) websites. Information about registration for the Field Test was also posted to
elpa21.org. The registration window officially opened on October 6, 2014.
Interested districts and schools were directed to the registration link, where they were asked to verify basic state, district, and school contact information and to supply information about the student population participating in the Field Test or trial. In order to make the best possible determinations about item performance, ELPA21 identified certain student categories for recruitment into the Field Test sample. These population categories are described below:
● Current ELLs: Students who are currently receiving ELL services and who have not yet reached proficiency sufficient for program exit. (Target: 50 percent of sample)
● Prior ELLs: Students who tested proficient in prior academic years and who are being monitored. (Target: 25 percent of sample)
● Screened but English Proficient: Students who were administered an ELL screener and were deemed proficient as a result of the screener. A Home Language Survey may have indicated a home language other than English, but the student is proficient. These students are also called Initially Fluent (IFEP). (Target: 15 percent of sample)
● English Only: Students whose Home Language Survey results indicated they were not potential ELLs and students whose only language is English. (Target: 10 percent of sample)
Students with Disabilities: ELPA21 items were constructed to be maximally accessible to ELLs with
disabilities. To help generate data about how these items perform for these students, ELPA21 encouraged
schools to include students with disabilities (both ELLs with disabilities and non‐ELLs with disabilities) in the
population taking the Field Test.
Field Test Training and Administration
The ELPA21 Field Test was administered as four separate domain testlets. Taken in its entirety, the ELPA21 Field
Test included all task types that were approved for use on the operational test. Progress was tracked as a
function of students completing each domain. To measure and monitor testing progress, Questar tracked test
starts and completions three times per week.
The field‐tested item bank inventory can be found in Appendix B.
C. TRAINING PROCESS
To prepare schools and districts for successful participation in the ELPA21 Field Test, ELPA21 developed a
comprehensive toolkit of training materials that included a series of eight training modules; a set of guides for
test administrators, test coordinators, and school‐based technical support staff; and a series of Interactive Demos
for use as practice tests.
The ELPA21 Field Test training toolkit was designed to allow test administrators, teachers, and state and local
education agency staff to preview the features of the assessment system; review administration procedures,
security features, and logistics; and ask questions.
i. TRAINING MODULES
The ELPA21 Field Test training toolkit included the following eight training modules:
1. The Field Test and Platform and System Trial Overview Training Module:
● Described the Field Test
● Presented the testing platform
● Provided a basic walk‐through
● Demonstrated how to start a test
● Gave users a general sense of what students would experience
2. Trial Orientation:
● How to use the Test Administration Site and dashboard
● An overview of the platform (not test items)
● Troubleshooting
● How to complete the post‐trial survey
3. Student Testing Session:
● How to register individual students
● Roster changes
● How to modify personal needs profiles
● How to use Interactive Demos
● Starting the student testing session
● Directions for administration
● How to pause and restart a testing session
4. Accessibility Features and Accommodations:
● Overview of accessibility features (universal and designated)
● Overview of accommodations
● How to create personal needs profiles: How to enter Individualized Educational Program (IEP) and 504
plans
● Student platform tools
5. Testing Lab Management:
● Security measures
● Bulk Upload/Pre‐ID File Upload
● Adding, modifying, and deleting school accounts
● System reports
6. Workstation Preparation:
● Workstation readiness
● Secure browser setup
● Safeguards against data loss
● Refreshing a workstation between testing sessions
7. Troubleshooting:
● Troubleshooting common issues
● Error prevention
● How to get help
8. Student Testing Experience:
● Walk‐through of the testing tools
● Interactive Demos and lesson plans
● How to navigate the test, pause and resume testing, and how to end a testing session and submit
The training modules could be viewed individually or as a set and could be viewed in any order.
Recommendations regarding audience were provided for each module, as they were designed to meet the
needs of a variety of specific audiences (District Test Coordinators, School Test Coordinators, School Technology
Coordinators, Test Administrators, Classroom Educators, etc.).
ii. FIELD TEST ADMINISTRATION AND TECHNICAL MANUALS
ELPA21 provided field test administration and technical manuals to prepare examiners, coordinators, and
information technology specialists for the administration of the Field Test.
ELPA21 Field Test Administrator’s Manual: Guided administrators in pre‐test preparations for students, such as
how to deliver the Interactive Demos, how to log students in to the testing platform, directions for
administration and instructions on testing tools and features, and what assistance an administrator could give to
students during the Field Test.
ELPA21 Field Test Coordinator’s Manual: Guided administrators on how to log in to the administrative platform,
upload Pre‐ID files, print student login tickets, and add or edit test administrator and student records.
ELPA21 Field Test Setup and Installation Guide: Offered instructions to set up and configure workstations and
tablets for Questar’s student test delivery system.
ELPA21 Field Test Accessibility and Accommodations Manual: Explained the ELPA21 accessibility and
accommodations framework, the selection and administration of universal and designated features, and
accommodations for individual students to produce valid assessment results.
iii. INTERACTIVE DEMOS
ELPA21 developed Interactive Demos (practice tests) to allow students, educators, and administrators at
participating schools to become familiar with the ELPA21 test environment, question format, tools, and
response types. Each demo included a lesson plan and teacher script. The grade‐band‐specific Interactive Demos
covered reading, writing, listening, and speaking items and test directions.
iv. HARDWARE AND HEADSET SPECIFICATIONS
The hardware specifications for the Field Test were determined based on the most common devices found in
schools by the PARCC and Smarter Balanced consortia. ELPA21 provided Questar with data about the hardware
specifications of devices in ELPA21 districts and schools. In return, Questar provided the technology
requirements of their online delivery platform along with recommendations based on the ELPA21 item types
and test design. ELPA21’s Field Test and Technology Task Management Team (FTT TMT), made up of state
education agency (SEA) representatives and technology experts, took the information provided and worked
iteratively with states and Questar to arrive at the final hardware requirements.
The development of headset specifications was more challenging, as only a few states were already using
headphones or headsets for testing; thus there was limited school inventory data available. Questar provided
the minimum requirements needed for their platform and ELPA21 surveyed states about usage. The FTT TMT
then explored options weighing four factors: affordability, appropriate size, availability, and the ability to be
used across devices. States were provided with minimum specifications, a suite of guidance documents, and a
variety of suppliers to aid in their procurement decisions.
For links to additional technical support resources provided to educators, administrators, and families prior to the ELPA21 Field Test, see Appendices A, C, and D.
III. FIELD TEST SURVEY
After the conclusion of the Field Test, ELPA21 administered an optional 51‐question survey to participating
educators. The purpose of the ELPA21 Field Test survey was to collect educator feedback on the following:
● Training Materials
● Accessibility and Accommodations Manual
● Interactive Demos
● Test Administration Site
● Test Items
● Customer Support
● Communications
● Test Administration
● Role‐Specific Questions

Most of the survey questions allowed respondents to provide detailed, open‐ended feedback, while others were simple yes/no questions.
A full list of survey questions can be found in Appendix H.
A. PROCESS FOR DEVELOPING SURVEY QUESTIONS
ELPA21’s Task Management Team leads held a series of discussions in early 2015 to develop the questions for
the post‐Field Test survey. The goal was to develop a survey that could be completed by educators and
administrators in less than 30 minutes and that would provide ample qualitative and quantitative data about
participants’ experiences during testing. Responses to the post‐Field Test survey were collected by Questar
within one month of the Field Test window closing, and were then systematically analyzed and categorized by
ELPA21’s Task Management Teams (TMTs).
B. SURVEY PARTICIPANTS AND OUTREACH
A total of 512 classroom teachers, test administrators, and district/school test coordinators responded to the
survey. All participants in the Field Test were invited to participate in the survey. The table below outlines the
frequency distribution of responses across states.
FREQUENCY DISTRIBUTION OF RESPONDENTS ACROSS STATES
State N percent
Arkansas 90 17.3
Kansas 92 17.7
Iowa 65 12.5
Nebraska 55 10.6
Ohio 20 3.9
Oregon 83 16.0
Washington 80 15.4
West Virginia 27 5.2
Total 512 100
C. GENERAL QUALITATIVE FEEDBACK
ELPA21 received substantial qualitative feedback from educators who participated in the Field Test. General
themes from feedback included the interactivity of the assessment, increased student engagement, and the
challenges associated with the transition to online testing. Below are quotes from educators who provided
feedback via the survey:
Positive Feedback
“100% of my students agreed they liked this format better than the pencil/paper option.”
“Students LOVE technology! They were engaged during the entire test! Well done.”
“Students loved the new test. Having different ways to answer questions changes it up for
them.”
“Items were more developmentally appropriate.”
Challenges and Suggestions for Improvement
“Some aspects of the test platform were challenging for students who are not familiar with the
technology.”
“Recording spoken responses was challenging at times.”
“I was overwhelmed by the amount of resources and materials provided prior to the Field Test
and didn’t know what to read first.”
D. MAJOR AREAS OF FEEDBACK AND RECOMMENDATIONS FOR OPERATIONAL ADMINISTRATION
After the close of the Field Test survey, ELPA21’s TMTs worked to systematically analyze and categorize the
feedback that was received. Responses to survey questions were divided into categories and were combined
with feedback that had been received from educators during item review, content and bias review, passage
review, and site visits. The categories of feedback included the following:
i. Training Processes and Materials
ii. Test Platform and Technical Issues
iii. Accessibility and Accommodations
iv. Communications
v. Item Specific Issues
vi. Helpdesk
After the feedback was reviewed, ELPA21’s TMTs developed detailed recommendations for ways to address any
issues or concerns raised during the Field Test. These recommendations will be provided to ELPA21’s
operational test vendors to ensure that all feedback and recommendations resulting from the Field Test are
incorporated into plans for the first operational administration of the ELPA21 summative assessment in school
year 2015‐2016. What follows is a summary of those recommendations and next steps.
Additional details can be found in Appendix E and Appendix F.
i. TRAINING PROCESSES AND MATERIALS
Quantity of Supporting Materials: Some feedback indicated that educators felt overwhelmed by the quantity of
supporting resources provided. As a result, our teams are working to reduce redundancies and ensure the clarity
of all resources developed to support the first operational administration of the ELPA21 summative assessment.
ELPA21 is using a templated approach, providing operational test vendors with support document templates that can be customized to fit each unique test platform.
ELPA21 recommends that operational vendors perform user testing on the administration materials in
conjunction with platform user testing to ensure a close, clear match of manual instructions with the
platform as experienced by the user.
Interactive Demos: One of the key takeaways from the Field Test was that the Interactive Demos are essential
to ensuring that a lack of prior exposure to technology does not impede students’ ability to demonstrate English
language proficiency.
ELPA21 will work with states’ operational vendors to ensure that the ELPA21 Interactive Demos are
easily accessible to all participating educators, and that they are fully representative of the ELPA21
assessment system. Additionally, ELPA21 will work with states’ operational vendors to ensure that the
accompanying lesson plans are useful resources for educators to use in their classroom prior to testing,
to ensure that all students are familiar with the technology used to administer the assessment.
ii. TEST PLATFORM AND TECHNICAL ISSUES
Test Administration Directions: Some educators indicated that the directions were too long to keep students’
attention.
ELPA21’s TMTs will work with vendors to review the current set of test administration directions and
consolidate when possible.
Student Test Platform Tools: Some educators reported that students forgot the intended use for different
platform tools after the initial explanation.
ELPA21 advises operational test vendors to develop a projectable or printable key that students can use during testing to ensure that no student is disadvantaged as a result of the technology used for test
administration.
Administration Platform: Many educators expressed that the Questar Administration Platform’s user roles and
permissions hierarchy did not parallel the actual roles and responsibilities of ELL educators and test
administrators in ELPA21 states.
ELPA21 encourages operational test vendors to work closely with state counterparts to ensure that they build out user roles and permissions to more closely match the existing state structures.
Proctor Password: This refers to the password used by test administrators to restart a student’s test after a test “time‐out.” Many educators reported that they were unaware of the proctor password’s location or its purpose during testing.
ELPA21 recommends that operational vendors ensure that both the purpose and location of the
proctor password are clearly indicated in the test administration manual.
Translated Directions: Some educators requested that test directions be translated to additional languages.
This request will be evaluated on a state‐by‐state basis.
Test Tickets: Some of the younger test‐takers struggled with lowercase letters in passwords, as keyboard keys are labeled with capital letters.

ELPA21 recommends that vendors not require students (particularly in grades K‐5) to have usernames and passwords containing both uppercase and lowercase letters. Mixed‐case credentials confused many students during the Field Test and required considerable teacher assistance.
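One straightforward way for a vendor to honor this recommendation is to generate credentials from an uppercase-only alphabet. The snippet below is an illustrative sketch, not Questar's actual ticket generator; dropping look-alike characters such as O/0 and I/1 is an additional assumption beyond the report's recommendation.

```python
import secrets
import string

# Hypothetical generator: uppercase letters and digits only, with
# easily confused characters (O/0, I/1) removed for young test-takers.
ALPHABET = "".join(c for c in string.ascii_uppercase + string.digits
                   if c not in "O0I1")

def make_student_password(length=6):
    """Return a random uppercase-only password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw = make_student_password()
print(pw)  # prints a 6-character uppercase code, e.g. K7PW3X
```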
iii. ACCESSIBILITY AND ACCOMMODATIONS
Universal Features, Designated Features, and Accommodations: Fifteen percent of survey respondents indicated that they did not have a clear understanding of the differences among the embedded and non‐embedded universal features, designated features, and accommodations available on the ELPA21 assessment.
ELPA21’s Administration, Accessibility and Accommodations Task Management Team will refine the
Accessibility and Accommodations Manual to provide additional clarity.
Read Aloud and Scribe Guidelines: In their comments, some survey respondents indicated the need for read‐aloud and scribe guidelines to inform the process of administering the related accessibility features and accommodations.
ELPA21’s Administration, Accessibility and Accommodations Task Management Team has drafted
read aloud and scribe guidelines highlighting the background of each support, qualifications of
individuals administering these supports, preparation procedures, specific guidelines, and post‐
administration considerations. These will be included in the operational version of the Accessibility and
Accommodations Manual.
iv. COMMUNICATIONS
Email Communications: While many survey respondents indicated that ELPA21’s email communications worked well, others commented that emails were sent too frequently.

The use of email to update schools and districts will continue; however, ELPA21’s communications team will ensure that communications are concise in an effort to reduce any perceived redundancy.
Sharing Resources: Some educators commented that it was difficult to know where to find all of the supporting documentation for the ELPA21 Field Test, as there were many different places to look for the information (the test vendor’s administration website, elpa21.org, SEA webpages, etc.).
ELPA21’s communications task force will work with states and operational vendors to streamline resources and communicate effectively about where to find them.
v. ITEM SPECIFIC ISSUES
Navigation: Many educators reported that students struggled with some platform features, including excessive scrolling, the two‐pane screen that required toggling, and answer options that appeared hidden.

ELPA21 is working with Questar to reformat items to resolve display issues, and vendors will receive items with updated code to ensure that all issues are resolved prior to the first operational administration.
Item Difficulty: In a few surveys, educators reported that in observing the testing session, they saw items that
appeared to be too difficult or not at grade level for the students taking the test or items that appeared to be in
the wrong domain testlet. This was a feature of the design of the Field Test, in which test forms were
constructed to maximize exposure of items and to provide linking data between forms. The Field Test also
contained items designed for the ELPA21 screener, some of which relied on more than one domain. On the Field
Test, students may have seen these experimental items, partial sets of items, or linking items from adjacent
grade bands; operational test forms will not have these features. Feedback regarding items that fell into these
categories was carefully reviewed and documented, but does not require a follow‐up action for operational
administration.
Additional information about item performance will be provided in the Field Test Technical Report.
vi. HELPDESK
Throughout the Field Test, Questar hosted a live helpdesk that could be contacted by email or phone during business hours. ELPA21 developed a dynamic FAQ based on calls that came in to the helpdesk, and updated it daily until the Field Test ended.
The majority of calls and emails the helpdesk received can be categorized into the following groups:
● Pre‐ID upload issues
● Chromebooks and iPads
● Questions about dates of administration
● General information inquiries
● Paper‐and‐pencil testing (Grades K and 1)
● Policy questions
● Proctor password location
● Access to resources
According to the survey responses received, about 80 percent of respondents indicated that any issues they
reported to the Questar helpdesk were resolved on the first call or within 24 hours. Survey comments regarding
the helpdesk were overwhelmingly positive.
● “I was very impressed with all of the representatives. They provided a ticket number, quick responses to
questions and returned calls immediately. Definitely a major advantage for using the test.”
● “The customer support was excellent. She even predicted questions that I would have and helped me!”
● “They were easy to contact, very helpful and polite, and it was nice to talk to and get e‐mails from a real
person. My questions were solved quickly and easily, without me feeling stupid for asking.”
Based on feedback received throughout the Field Test and via the survey, ELPA21 recommends the following to states and their operational platform vendors:
School‐based administrators and educators: ELPA21 strongly recommends that school‐based test administrators coordinate with technology directors prior to testing to ensure that an in‐school technology assistant is on hand during test administration, particularly during the log‐in process.
Operational vendors: Same‐day resolution of technical issues is critical for testing success. Educators often have limited opportunities to step outside the classroom to report technical issues, so it is beneficial to resolve as many issues as possible during a single call.
States: Ensure that your operational vendor has a state‐based contact who can be quickly reached in order to resolve any questions related to state policy.
IV. CONCLUSION
A. REFLECTION
ELPA21’s Field Test was an excellent proving ground for ELPA21 items, training and support materials, and student practice opportunities. The development of a new, innovative assessment system based on progressive new ELP standards offers the opportunity to advance the field and better support ELLs. The ELPA21 Field Test was a culminating event that represents a massive, coordinated effort among project staff and vendors; state, district and school‐based educators and administrators; industry experts and thought leaders; and our students.

B. PLANNED SUPPORT FOR YEAR ONE
For school year 2015‐16, ELPA21 states will contract individually with assessment platform vendors to deliver the ELPA21 summative assessment in their states. The Council of Chief State School Officers (CCSSO) created this report to identify trends in Field Test feedback and to provide actionable guidance to states and their platform vendors.
C. RECOMMENDATIONS FOR STATES
As a result of the Field Test, ELPA21 identified the following major themes and takeaways:
1. Opportunities for students to practice are critical. ELPA21 task types may not be familiar to students and
newly‐arrived ELLs may not have the necessary keyboarding or mousing skills to access the assessment.
Suggestions for increasing student familiarity with the test include orienting students to test directions
and testing tools, allowing students to practice using Interactive Demos, working closely with students
using the provided lesson plans, and treating practice test items as performance tasks for whole‐class
discussion. Platform vendors should ensure that their training program emphasizes multiple, student‐
specific opportunities for practice prior to the operational administration of the summative assessment.
2. Classroom educators and test administrators need practice too. Test administrators who reviewed
training materials well in advance reported significantly fewer issues than did their colleagues who were
not able to prepare in advance. States are advised to work closely with their vendors to roll out training
components with enough advance time and support to allow educators to thoroughly familiarize
themselves with administration guides, directions for administration, student log‐in procedures and
troubleshooting, Interactive Demos, and lesson plans several weeks ahead of students’ practice time
and testing appointments.
3. Communication needs to be well planned. Many respondents commented that there were too many
emails from too many sources. States and districts should rely on established communication channels
and intervals to distribute information about the new assessment, including memoranda and bulletins,
thoughtfully scheduled email blasts, regular updates to assessment department webpages, and
established social media protocols. Digital resources such as administration manuals should be stored in
familiar, easily accessible locations, such as on a state or district’s ELPA21 webpage.
4. Platform presentation of the items needs to be accessible and consistent. While ELPA21’s items rely on
standard QTI 2.1 coding and meet APIP standards, certain interactions may require vendors to manually
change code to render displays properly in the testing platforms. Vendors will be provided with a QTI
and APIP manual and item specifications, as well as a set of sample items that demonstrate the correct
format and performance of items. States are advised to work closely with platform vendors and Questar
to ensure that items display as designed. Vendors should schedule in‐platform item reviews so that state
contract managers can review and sign off on items prior to their deployment into test forms.
5. Flexibility may be needed in scheduling testing appointments. Recommended administration times are a
guideline and may not accurately describe all administration scenarios. In certain classroom setups,
testing stations may be too close together to provide a distraction‐free space, especially during the
listening and speaking portions of the assessments. Students with IEPs may need longer breaks and
other personalized supports that may affect the testing environment and scheduling. Districts and
schools should take these needs into account when scheduling testing sessions.
6. Timely helpdesk support is critical. In most school‐based settings, educators can be hard to reach during
the school day, and testing schedules do not typically allow much flexibility for dealing with technology
issues. It is therefore critical that vendor helpdesks use all means available to resolve a user’s issue
during the initial contact; leaving an issue unresolved may result in a student’s testing session being
rescheduled or suspended. Vendors should also staff helpdesks at a level that supports short hold times
and makes it unnecessary for users to leave a voicemail during regular hours.
These trends and remedies are discussed in detail in the body of this report and will be deployed across the suite
of templated materials ELPA21 will provide to vendors to support training and administration.
V. ACKNOWLEDGEMENTS
ELPA21 would like to acknowledge and thank the following individuals who contributed to the development of
this report:
ELPA21 Task Management Team Leads
Bill Auty
Wes Bruce
Laurene Christensen
Mark Hansen
Kara Schlosser
Mary Seburn
Vitaliy Shyyan
Martha Thurlow
Phoebe Winter
CCSSO
Margaret Ho
Lauren Lynch
Cathryn Still
APPENDIX A: ADDITIONAL RESOURCES PROVIDED TO EDUCATORS, ADMINISTRATORS AND FAMILIES PRIOR TO THE ELPA21 FIELD TEST: TRAINING AND TECHNICAL SUPPORT
Throughout the Field Test, ELPA21’s Field Test and Technology (FTT) Task Management Team (TMT) worked
collaboratively with Questar and the ELPA21 Communications Task Force to develop the following support
materials, which can also be found at elpa21.org/fieldtest.
ELPA21 State/District/School Test Administrators' Quick Start Checklists
Field Test FAQ for Families
Field Test Administration and Technical Manuals overview
Field Test and Platform and System Trial Training Modules Overview
ELPA21 Field Test FAQ
ELPA21 calendar of events (generic)
ELPA21 Hardware Specifications
ELPA21 Headset Specifications
Additional Headset Information
ELPA21 Headset Kits: Tips and Tricks
ELPA21 Field Test v. Platform and System Trial Chart
APPENDIX B: ELPA21 FIELD TEST ITEM BANK INVENTORY TABLE
Key: Dev = items to be developed for review; FT = items to be field‐tested; L = Listening, S = Speaking, R = Reading, W = Writing.

| Grade Band | Item Type | Dev L | Dev S | Dev R | Dev W | FT L | FT S | FT R | FT W |
|---|---|---|---|---|---|---|---|---|---|
| K | SR Items | 90 | | 114 | | 75 | | 95 | |
| K | TE Items | 30 | | 30 | 84 | 25 | | 25 | 70 |
| K | SCR Items | | 124 | | | | 99 | | |
| K | ECR Items | | 6 | | | | 5 | | |
| K | Total | 120 | 130 | 144 | 84 | 100 | 104 | 120 | 70 |
| 1 | SR Items | 96 | | 114 | | 80 | | 95 | |
| 1 | TE Items | 24 | | 42 | 90 | 20 | | 35 | 75 |
| 1 | SCR Items | | 74 | | | | 59 | | |
| 1 | ECR Items | | 6 | | | | 5 | | |
| 1 | Total | 120 | 80 | 156 | 90 | 100 | 64 | 130 | 75 |
| 2‐3 | SR Items | 77 | | 98 | | 64 | | 82 | |
| 2‐3 | TE Items | 19 | | 36 | 48 | 16 | | 30 | 40 |
| 2‐3 | SCR Items | | 55 | | 20 | | 44 | | 16 |
| 2‐3 | ECR Items | | 8 | | 13 | | 6 | | 10 |
| 2‐3 | Total | 96 | 63 | 134 | 81 | 80 | 50 | 112 | 66 |
| 4‐5 | SR Items | 101 | | 96 | 24 | 84 | | 80 | 20 |
| 4‐5 | TE Items | 19 | | 31 | 29 | 16 | | 26 | 24 |
| 4‐5 | SCR Items | | 60 | | 5 | | 48 | | 4 |
| 4‐5 | ECR Items | | 18 | | 13 | | 14 | | 10 |
| 4‐5 | Total | 120 | 78 | 127 | 71 | 100 | 62 | 106 | 58 |
| 6‐8 | SR Items | 101 | | 98 | 38 | 84 | | 82 | 32 |
| 6‐8 | TE Items | 19 | | 34 | | 16 | | 28 | |
| 6‐8 | SCR Items | | 55 | | 10 | | 44 | | 8 |
| 6‐8 | ECR Items | | 8 | | 13 | | 6 | | 10 |
| 6‐8 | Total | 120 | 63 | 132 | 61 | 100 | 50 | 110 | 50 |
| 9‐12 | SR Items | 101 | | 105 | 38 | 84 | | 88 | 32 |
| 9‐12 | TE Items | 19 | | 35 | | 16 | | 29 | |
| 9‐12 | SCR Items | | 55 | | 10 | | 44 | | 8 |
| 9‐12 | ECR Items | | 8 | | 13 | | 6 | | 10 |
| 9‐12 | Total | 120 | 63 | 140 | 61 | 100 | 50 | 117 | 50 |
| | Totals per Domain | 696 | 477 | 833 | 448 | 580 | 380 | 695 | 369 |

Grand Total: 2,454 items to be developed for review; 2,024 items to be field‐tested.
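The Total rows in the inventory table are column sums of the item‐type rows above them. As an illustrative check, the sketch below sums the grade band K development counts by domain, assuming the SR and TE values fall under the Listening and Reading columns and the SCR and ECR values under Speaking, as implied by the Total row:

```python
# Grade band K, items to be developed for review, by domain:
# L = Listening, S = Speaking, R = Reading, W = Writing.
k_developed = {
    "SR Items":  {"L": 90, "R": 114},
    "TE Items":  {"L": 30, "R": 30, "W": 84},
    "SCR Items": {"S": 124},
    "ECR Items": {"S": 6},
}

def domain_totals(rows):
    """Sum item counts per domain across item types."""
    totals = {"L": 0, "S": 0, "R": 0, "W": 0}
    for counts in rows.values():
        for domain, n in counts.items():
            totals[domain] += n
    return totals

print(domain_totals(k_developed))
# Matches the K development Total row: {'L': 120, 'S': 130, 'R': 144, 'W': 84}
```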
APPENDIX C: ELPA21 HEADSET SPECIFICATIONS
The following outlines recommended headset features and Questar’s rationale in recommending those features. Please note that Questar does not endorse specific brands or devices.

Device: Headset with attached microphone
Reason: A headset with an attached microphone allows recording and playback using the same device.
Not recommended: Separate headphones and microphones increase the need to ensure proper connection and set‐up on the computer and complicate the testing‐site set‐up.

Headset design: Over‐ear headphones
Reason: Over‐ear headphones are comfortable to wear for longer periods by students of different ages, and the weight and size of the headphones can be selected based on students’ age. Portable headphones are smaller and lighter and hence may be suitable for younger students.
Not recommended: For the general population, in‐ear headphones (ear buds) that are placed directly in the ear canal are more difficult to clean between uses, and they may not be suitable for younger students. Ear‐bud microphones are attached to the cord, which makes capturing the student’s voice more problematic.

Playback mode: Stereo
Reason: The sound files of the assessment are recorded and played back in stereo.
Not recommended: Mono headsets.

Noise cancellation: None needed
Reason: Noise cancellation often does not cancel out the sound of human voices.
Not recommended: Many headsets with a noise‐cancellation feature require a power source; this adds cost and complicates the testing set‐up.

Volume control: None needed
Reason: Students will be able to control headset volume via the testing device (computer, laptop, tablet, etc.), and there is an audio check built into the Questar platform.
Not recommended: When using headsets with a built‐in volume control, students may accidentally adjust headset volume or mute themselves. This feature also adds to cost.

Connector plug: One 3.5 mm plug, two 3.5 mm plugs, or USB
Reason: Headsets must be compatible with the computer used for testing. Headsets with two 3.5 mm plugs can be converted with a “Y” adaptor to a single plug and used with tablets.
Not recommended: USB‐connected headsets require driver installation and may need to be enabled as the playback/recording device; USB‐connected headsets for iPads and Android tablets are not available/supported. Bluetooth is not recommended.

Special considerations: Alternative‐size headsets may need to be considered for smaller children, although they are not required. Ear buds with a microphone may be considered for students who wear head scarves or wraps, etc.

*Questar recommends avoiding microphones with windscreens (e.g., foam covers) unless they are removable, to assist with cleaning between uses.
APPENDIX D: ELPA21 HARDWARE SPECIFICATIONS
The ELPA21 assessments can be delivered on desktops, laptops, iPads, Android tablets and Chromebooks that meet the following specifications.

Requirements common to all platforms:
● System memory: minimum 256 MB free RAM; recommended 512 MB free RAM
● Screen: minimum 10" class screen size (10" class is 9.5 to 10.5 inches) at a minimum resolution of 1024 x 768
● Internet speed: minimum 150 kbps per device (30 kbps with proctor caching); recommended 300 kbps. If more than 100 students are testing simultaneously, proctor caching is required.
● Headset: device‐compatible headset with built‐in microphone, with either standard headset plug(s) or a USB connection

Desktops and laptops (delivered via a secure browser):
● Windows XP/Vista/7/8/2003/2008, latest service pack (NOTE: Windows 8 RT is not supported): minimum Intel Pentium 4 1.0 GHz equivalent or higher CPU; recommended Intel Core 2 Duo 1.6 GHz equivalent or higher performing CPU
● Mac OS 10.6/10.7/10.8/10.9/10.10: Intel Core 2 Duo 1.6 GHz equivalent or higher performing CPU
● Linux, Ubuntu 11.10 or Fedora 14 (NOTE: Debian 9 / OpenSUSE 11.1 are under review): minimum Intel Pentium 4 1.0 GHz equivalent or higher performing CPU; recommended Intel Core 2 Duo 1.6 GHz equivalent or higher
● Storage: minimum 1 GB free storage space
● Screen size: 12" or larger recommended
● Network: minimum 100 Mbps LAN or 802.11g wireless (54 Mbps or greater), with at least 1 Mbps of available LAN bandwidth per workstation; recommended 1 Gbps LAN or 802.11n wireless (150 Mbps or higher), with 2 Mbps available per workstation

Tablets and Chromebooks (delivered via a secure app; an external keyboard is required for tablets):
● Android tablets, versions 4.2, 4.3, 4.4, on the following supported devices: Google Nexus 10, Samsung Galaxy Tab 2 (10.1), Samsung Galaxy Note (10.1), Motorola Xoom, Motorola Xyboard; 1.0 GHz dual‐core equivalent or higher
● iOS iPads, iOS 6, 7, 8; 1.0 GHz dual‐core equivalent or higher
● Chromebooks, Chrome OS versions 29 through 37 (NOTE: Chrome OS manages and controls the upgrade process; upgrades will be reviewed and confirmed as they are released); 1.6 GHz equivalent or higher
● Storage: minimum 256 MB free storage space
● Network: minimum wireless 54 Mbps or greater, with at least 1 Mbps of available bandwidth per workstation; recommended 802.11n wireless (150 Mbps or higher), with 2 Mbps available per workstation
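The per‐device bandwidth minimums above (150 kbps per device, or 30 kbps with proctor caching, with caching required above 100 simultaneous students) translate into simple site‐level arithmetic. A minimal sketch; the function name is illustrative, not part of any ELPA21 tooling:

```python
def min_site_bandwidth_kbps(students, proctor_caching=False):
    """Estimate minimum internet bandwidth for a testing site.

    Uses the ELPA21 minimums: 150 kbps per device without proctor
    caching, 30 kbps per device with it. More than 100 simultaneous
    students requires proctor caching.
    """
    if students > 100 and not proctor_caching:
        raise ValueError("Proctor caching is required for more than "
                         "100 simultaneous students.")
    per_device = 30 if proctor_caching else 150
    return students * per_device

# 40 students without caching need at least 40 * 150 = 6,000 kbps (~6 Mbps).
print(min_site_bandwidth_kbps(40))                          # 6000
# 120 students may only test with proctor caching: 120 * 30 = 3,600 kbps.
print(min_site_bandwidth_kbps(120, proctor_caching=True))   # 3600
```

Actual needs depend on local conditions, which is why schools are advised to trial their set‐up (for example, via the Interactive Demos) before testing.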
APPENDIX E: ELPA21 FIELD TEST PLATFORM ISSUES TRACKER
ELPA21 Field Test Survey Responses: Identified Issues and Next Steps
Subject Description of Issue Next Steps
Accessibility Braille use and ASL.
ASL and Braille were not features of the ELPA21 Field Test, but ELPA21 will be working with operational vendors on approaches to including these supports during the operational administration.
Admin Directions Needs more complete directions for how to add students manually.
ELPA21 will work with operational platform vendors to ensure that robust administrative directions are included in the Test Administrator’s Manual (TAM).
Admin Directions However, I feel that there need to be some more specific directions regarding the speaking questions, such as the time limit written below the question.
ELPA21 will work with operational platform vendors to ensure that robust directions are included in the TAM.
Admin Directions
The directions for the tools are way too long for young students. They might be better if the directions are screens that students go through one at a time so they can actually see what the proctor is talking about.
ELPA21 will work with operational vendors to discuss the possibility of developing a “guide to ELPA21 tools” one‐pager that could be projected during testing (or printed out for individuals if necessary).
Admin Directions Also, the practice test was short so they were not able to fully adjust to the new format previous to taking ELPA 21.
ELPA21 will work with platform vendors to ensure that the Interactive Demos cover the full range of questions and provide sufficient practice opportunities for students to feel comfortable with the technology. ELPA21 also encourages educators to use the demos as many times as necessary for students to feel comfortable.
Admin Platform
Login information needs to be streamlined. I had a login/password for district coordinator, building coordinator at three different buildings, and school test administrator at three buildings. Let the district coordinator enter names and check which roles that person will be fulfilling, then give them ONE login and password that accesses the levels of information they need.
Questar and other operational vendors will discuss the best approaches to developing a useful permissions hierarchy. Educators managing testing in multiple schools should be able to manage those through a single username/password.
Admin Platform
It is very 'busy' and it was at times difficult to find what I was looking for. There are too many things in obscure places and under too many different menus. I think some of the resources could be combined on the same page or organized better as it often took me a couple of extra clicks to find what I needed.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform I love how I could monitor students' progress during the testing
ELPA21 will maintain monitoring of student progress during testing.
Admin Platform There seemed to be too many topics under the HELP section.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform Setting up classes was a bit more challenging than I thought it would be.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform Trouble locating student log in information and testing status, after a student had begun the test and was resuming the following day or the same day.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform
When I tried to get in and change things (groupings) for my school it was a challenge. After I got things switched around for me to manage my own school things went much smoother.
QAI and other operational vendors will discuss the best approaches to developing a useful permissions hierarchy. Educators managing testing in multiple schools should be able to manage those through a single username/password.
Admin Platform Vague, poorly worded, needs an editor; much more of a hassle to use than other test systems
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform I spent a great deal of time reading during the testing session because of various documents, etc.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform It was easy after I had called the help center to figure out how to add students etc.
Helpdesk services for the operational assessment will be provided by states’ platform vendors.
Admin Platform
The fact that teacher log‐ins can only have one school made it difficult to maneuver two different log‐ins. I couldn't access the Proctor Password unless I called Questar. I teach at various schools so this was especially frustrating.
QAI and other operational vendors will discuss the best approaches to developing a useful permissions hierarchy. Educators managing testing in multiple schools should be able to manage those through a single username/password.
Admin Platform
Clearly make the labels for where the classroom administration of the assessment can be found. This is the area where the students are seen as "in progress" of taking the test.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform Generally, the interface worked well. This is helpful feedback.
Admin Platform
Organizing students by classroom is helpful to set up and plan for testing, but not useful to monitor testing if students from different classrooms are in the same testing session.
QAI and other operational vendors will discuss the best approaches to developing a useful permissions hierarchy. Educators managing testing in multiple schools should be able to manage those through a single username/password.
Admin Platform I thought it was strange that all training materials were under help rather than a tab that said training materials.
QAI and other operational vendors will discuss the best approaches to streamlining the administrative website. These comments will be taken into account when optimizing the layout.
Admin Platform
I had a hard time finding my students after the upload. I then discovered that I had to put a check mark in the View students across all subjects and students not assigned to a class box on the student tab. Next, I had to click on view/edit student, scroll to the bottom of each student, find my name to click on and then they were added to my class. I did not find these specific instructions in the manuals and customer support could never fully answer my question. I was able to solve this problem by taking time to look through the site and figure it out by myself.
QAI and other operational vendors will discuss the best approaches to developing a useful permissions hierarchy. Educators managing testing in multiple schools should be able to manage those through a single username/password.
Admin Platform
Also, it would be nice to know how far along students are in the assessment. For example, as a proctor, if I know that a student is on the last problem, I might keep him or her to finish the test instead of releasing them back to class and then pulling them from instruction a 2nd or 3rd time to maybe only do one question.
ELPA21 will discuss the feasibility of developing such a feature/dashboard with operational vendors.
Admin Prep Condense info ‐ there was too much! I needed something to take me through the materials and orient me more simply.
ELPA21's communications team is aware of this request and will be working with operational test vendors to condense information.
Admin Prep It would be beneficial to have the test administration manuals separate for each grade band.
ELPA21 will discuss the feasibility of developing grade‐specific TAMs with operational vendors, though this is unlikely. Alternatively, it may be beneficial for educators to print only the elements of the TAM that pertain to the particular grade band with which they are working.
Admin Prep
The easily printable sign‐in sheets with full class sign‐in info were an outstanding and very useful idea. It would be wonderful if the proctor password could be included on the sheets.
Operational vendors will be made aware of this recommendation.
Admin Prep I think there should be a recommended guidance of how many tests students should take on a given day
States will discuss the development of such a recommendation with their operational vendors.
Admin Prep There was a SET button on the screen that I never did see what it did and no one in my district knew either
Operational vendors will be made aware of this issue.
Admin Prep The Administrators Manual had some wording that was inconsistent with what students were seeing on the screen concerning login.
Operational vendors will be made aware of this issue.
Arrows The arrows at the top and bottom of the page need to be clearly identified so the user understands the functionality of the arrows. (expandable passage)
Operational vendors will be made aware of this issue.
Bandwidth Videos would not load. Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Bandwidth The Internet was too slow. Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Bandwidth Also, slow process when loading the test. Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Bandwidth Also, some students had to wait about 15 minutes before the test loaded.
Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Bandwidth We were unable to use the caching system however. Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Bandwidth
Students had to wait quite a bit, when trying to operate the interface. There was a significant delay for the next screen to appear at times. Students thought something was wrong and would click, trying to resolve the problem.
Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Bandwidth/Test Platform
We had huge computer issues trying to access the different parts of the test. We were kicked out of the test or it froze up numerous times. We were never able to access the listening section so none of my students completed it.
Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Basic Navigation When I didn't have a bookmark or a link, it was difficult to find the site. I had to dig through the informational website to find it.
Schools should ensure that this information is stored in an easy‐to‐access space prior to testing.
Basic Navigation It was quite different than the other ELPA and so I was looking for things that did not appear to exist.
Schools should assess, via the Interactive Demos, how the ELPA21 test will perform with their tech set‐up prior to testing.
Chromebooks
Our IT person indicates a district level procedure for being able to use Chromebooks is possible and recommends that process rather than every chrome book needing to be handled.
Operational vendors will be made aware of this issue. They need to ensure that the requirements for setting up different devices are clear to schools and districts.
Chromebooks Chromebooks needed a 2‐week lead time for preparation for test administration. Set‐up for use was very challenging.
Operational vendors will be made aware of this issue. They need to ensure that the requirements for setting up different devices are clear to schools and districts.
Chromebooks Small screen size made navigation hard. Operational vendors will be made aware of this issue so that they can develop special guidelines for using Chromebooks during testing.
Chromebooks Not a lockdown browser on half of the computers used. Students could access the internet during testing.
Operational vendors will need to ensure that their browsers are secure on all devices.
Chromebooks
Our students use Chromebooks. They were not able to access the Listening section of the test. It just froze after checking the headset. We got to the arrow and it just spun.
Operational vendors will be made aware of this issue so that they can develop special guidelines for using Chromebooks during testing.
Chromebooks
The chrome books kept kicking the students off the ELPA testing system Our computers have network sharing and when they logged in to their own accounts, the Questar system would not load. (These issues each took 45 minutes to solve).
Operational vendors will be made aware of this issue so that they can develop special guidelines for using Chromebooks during testing.
Chromebooks
When the Chromebook asked students to "Allow" access to the headphones, if they hit ANYTHING before they hit Allow, the recording wouldn't work. They had to completely sign out of the testing platform and get back into it to be able to continue.
Operational vendors will be made aware of this issue so that they can develop special guidelines for using Chromebooks during testing.
Communications I just received items from the district office as needed, either via email or through district mail. It would have been helpful to have all items at one time.
QAI and other operational vendors will discuss the best approaches to developing a useful communications hierarchy. Educators managing testing in multiple schools should be able to manage those schools through a single username/password, and similarly should receive only a single piece of communication about any updates.
Communications
The manuals were very helpful, but until the test started, I wasn't sure about quite a few things. I think I needed more detail in the instructions; for example, how the test was to be divided into four separate sections.
ELPA21 will work with operational vendors to ensure that the TAM has all necessary information.
Directions Give us some "wiggle room" to adjust the directions without losing the reliability and validity.
ELPA21 will work with operational vendors to discuss the extent to which test administrators may deviate from or customize test directions, particularly for young students or recent arrivals, if students appear to be struggling to understand even the most basic test directions/administrative procedures.
Drag and Drop
Some of the first graders had difficulty dragging letters into boxes. The boxes and letters were quite small and it was pretty picky about where students had to drop the letter to get it to “stick” in that box. Dragging and dropping was also challenging with whole words, but not quite as bad as the individual letters.
ELPA21 recommends that educators ensure that all students have experience practicing the various technical aspects of the test by using the interactive demos and lesson plans prior to testing.
Headphones Headphones and microphones are frustrating when they don't work. (Mixed Headset and Platform Issues)
ELPA21 encourages educators to test headphones and microphones on all devices that will be used for testing, prior to allowing students to begin a testing session.
Headphones Headphones dropping, freezing in the middle of questions or between segments. (Needing restart) (Chromebook)
ELPA21 encourages educators to test headphones and microphones on all devices that will be used for testing, prior to allowing students to begin a testing session. Also, please make sure to consult ELPA21's Technical Specs and Headset Specs documents to ensure compatibility.
Headphones Some of the headphones & microphones didn't work at first. Fortunately, we had a tech guy present at the 1st session to help with this!
ELPA21 encourages educators to test headphones and microphones on all devices that will be used for testing, prior to allowing students to begin a testing session. Also, please make sure to consult ELPA21's Technical Specs and Headset Specs documents to ensure compatibility.
Headphones More specific suggestions about the minimum quality of headphones would be helpful.
ELPA21 encourages educators to test headphones and microphones on all devices that will be used for testing, prior to allowing students to begin a testing session. Also, please make sure to consult ELPA21's Technical Specs and Headset Specs documents to ensure compatibility.
Headphones
We had what we thought met the specs, but because our headphones and microphones were not integrated (which came out as a later recommendation), my students were not able to take the listening or the speaking test.
ELPA21 encourages educators to test headphones and microphones on all devices that will be used for testing, prior to allowing students to begin a testing session. Also, please make sure to consult ELPA21's Technical Specs and Headset Specs documents to ensure compatibility.
Headphones However, the speaker (sic) part was problematic because the microphones would pick up the voices of surrounding students.
ELPA21 will work with states' operational vendors to include more information in our headset specs about the directionality of microphones.
Headphones
However, the earphones/ microphones caused so much frustration by not working, having to be reordered, static, stopping in the middle of sentences, etc. Also, when recording answers, the microphones picked up various students' voices at one time. The students were quite bothered by that interference.
ELPA21 will work with states' operational vendors to include more information in our headset specs about the directionality of microphones.
Headsets
We were not able to have more than a few students do the speaking at one time, as the microphones picked up all voices from around the room. We had to set students in far corners and have them face away from the class.
ELPA21 will work with states' operational vendors to include more information in our headset specs about the directionality of microphones.
iPad
On the iPads, students would have to tap their answer many times before the image would actually be selected. They would sometimes sit for over a minute trying to select an answer.
ELPA21 and states' operational vendors will explore this during user acceptance testing (UAT).
iPad
Out of the 2 days we set aside to administer the assessment, the company's servers did not work with our iOS devices. When some students tried to answer a question by pressing on the screen, it took a great amount of pressure. If it did not highlight right away, students chose different answers.
ELPA21 and states' operational vendors will explore this during UAT.
iPad
We began testing on the iPad. It was a terrible experience. The app would freeze and the only solution was to force quit the app, then go back in, re‐log in and proceed.
ELPA21 and states' operational vendors will explore this issue during UAT.
Microphones Nowhere did any checklist say that a microphone was needed for each student or what kind of microphone.
This information was provided in several documents that were shared throughout the consortium.
Passwords Username and password should be simplified. Many students had trouble with uppercase/lowercase number combos.
ELPA21 and states will discuss with operational vendors whether passwords must be case-sensitive. Similarities between characters (I and l, 0 and O, etc.) should also be kept in mind. Consider not making usernames/passwords case-sensitive, particularly for younger students.
Professional Development
Host the training online right before the event. States should discuss the feasibility of this type of training.
Professional Development
An annual refresher webinar would be helpful. States should discuss the feasibility of this type of training.
Professional Development
A hands‐on training session for administration would be helpful.
States should discuss the feasibility of this type of training.
Professional Development
It was overwhelming to do it all on my own time, however. I suggest that districts set aside a half day for ELPA21 training because there are so many resources.
States should discuss the feasibility of this type of training.
Professional Development
A second training module for each level would be even better. Training modules were very good.
States can discuss the feasibility of developing additional training modules with their operational vendors.
Pre‐ID The Pre‐ID template was quite confusing. States and operational vendors will discuss the burden of uploading student data at the local level. It may be possible for student data to be uploaded at the state level.
Proctor Password Proctor password was hard to locate States should work with their platform vendors to ensure the proctor password is easy to locate.
Proctor Password The proctor password was hard to locate and was required unnecessarily.
States should work with their platform vendors to ensure the proctor password is easy to locate.
Scribe
Clearer scribing instructions are needed (for multiple choice, dragging, and typing answers). Scribing issues also need to be considered by grade, as K‐2 has a paper‐and‐pencil version.
More detailed instructions for the use of scribes will be available in the operational version of the ELPA21 Accessibility and Accommodations Manual.
Scrolling
The scrolling bars are very thin and difficult to control. Students using windows mice were able to scroll up and down using the wheel on the mouse, but that isn’t an option for right‐to‐left scrolling and the Mac mice don’t have the wheel. Students struggled to get the scroll bar to move.
ELPA21 will work with states and their platform vendors to improve the navigation tools for the operational assessment.
Scrolling Students needed to scroll. ELPA21 will work with states and their platform vendors to improve the navigation tools for the operational assessment.
SIG A section on troubleshooting technology glitches is needed, for example, right‐clicking to reload pages.
States should discuss with operational vendors developing a "five most likely technical issues" doc, or appendix to the TAM. (Note: Keep this short and simple)
SIG
On Chromebooks we had a slight problem because you have to tell the computer that it is okay to use the microphone...if they clicked before they did that, they had to start over from the beginning. Not sure what to do about that but definitely should be addressed.
Vendors should add specifics to Chromebook guidance as needed.
Speaking Multiple re‐recordings were needed for speaking.
ELPA21’s Implementation Manual will provide states and their vendors with the number of times a student may record a response. For students needing an accommodation, consult the Accessibility and Accommodations Manual.
Student Assignments
Assigning students to teachers won't work well for us. We have lots of students who have more than one ELL teacher who would need to access their results. Please find a way to allow us to choose not to assign students to specific teachers and just give as many teachers access as necessary at the school level.
States and vendors need to discuss roles in test administration to ensure the best design for the administrative platform, access to resources, and permissions.
Student Password The student usernames and passwords were very confusing/long: UPPER/lower, l/1, 0/O.
ELPA21 will discuss with operational vendors whether passwords must be case-sensitive. Similarities between characters (I and l, 0 and O, etc.) should also be kept in mind. Consider not making usernames/passwords case-sensitive, particularly for younger students.
Student Tech Prep For students with limited experience using a computer keyboard, it was very hard and took a lot longer than it should have. (Keyboarding General)
ELPA21 encourages educators to work with students on the interactive demos so that they become familiar with the technology required for the test, prior to testing.
TAM The "Test Administrator's Manual" needs to include screenshots or information about what the students should be doing.
ELPA21 will discuss the feasibility of including (more) screenshots in states' TAMs with their operational vendors.
Test Design The test took much longer than estimated (too long). It took the students 150–200 minutes to complete the entire test.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design
Students are self‐aware when speaking aloud, especially language learners, and don't like others to hear them. Even with headphones on, students can hear each other speak or at least worry others will hear. (reluctant to speak)
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design
There were a couple of test items that tripped students up. One asked students to write three questions to ask a class visitor. Many students wanted to put all three questions in the first box, not realizing that two more boxes would follow (one for each question).
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design Have a paper‐and‐pencil section for 2nd grade similar to 1st grade.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design The 1st grade finished each section in a class period. The 2nd grade took about 2 weeks of 25 minute sessions. I think it was too long.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design There need to be questions that are lower level for students who have just come to the country.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design
However, it was difficult for students to switch back and forth between the different question formats. They didn't always understand what they were being asked to do.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design During speaking, students were not as engaged as they appear to be with a tester.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design Some did not understand that if the question asked for several reasons they had to record the whole answer at once and could not pause in between reasons.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design The speaking part may be inaccurate because there is no human to make the student feel cozy and comfortable. The microphone is a monster to some of them.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design Many students were frustrated with the length and had problems answering questions.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design Some of questions' format was confusing to the kids. This was very stressful to them.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design I also found it odd that some students received the "record your name now" prompt in the speaking section while others did not.
Operational vendors need to perform quality control to ensure that all students encounter the same standard prompts.
Test Design Recording their voices was especially hard for kindergartners and for students who spoke softly.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design Students were anxious about the fact that they could only record themselves twice on the speaking section.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design Reading, speaking, and listening questions were included in all sections of the test?
The design of the Field Test was intended to include some experimental items that may have seemed to be "out of domain" items. This will not be the structure/format of the operational ELPA21.
Test Design
My level 1 students had a hard time figuring out what the questions wanted them to do. Especially when they showed the exact same paragraph twice and they had to choose a sentence from the paragraph.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design
Writing: the text box was too large. Students felt like they needed to take up the entire space with random characters. Another issue that threw students off was the question/reading screens that did not have anything to do on the page. The students have been trained to answer every question on every page, and when there wasn't anything to do, they froze and often asked for help. This made it a problem for the proctors to say only "Do your best," as indicated.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design
Some of the answer sections have subcategories, which several students (both native and non‐native speakers) became confused by—they thought that the test wasn't recognizing that they had answered the question because they overlooked the number of unanswered questions at the top of the page. We would recommend removing the subcategories to have more clarity for the assessment.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Design
Some questions didn't specify what section the student was on. Sometimes it said at the top: reading, writing, listening, or speaking, and sometimes it did not.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to the test design for the operational ELPA21.
Test Directions
It would be helpful to be provided with more scripted information to reply to students as you guide them through the test taking process. For example, one first grade student who is a newcomer and has very limited English skills was paralyzed by the speaking test. He just stared at the screen. As proctors we weren't sure if we could say statements like: "Just click next to keep moving through the test." If a student doesn't understand anything are we allowed to coach them to skip questions in order to complete the test? And if so, what exact words should we use?
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to adjusting test directions. Our teams will also work with operational vendors to discuss the feasibility of allowing proctors some flexibility with regard to interpreting the written test directions to assist students with very low levels of comprehension (students who have little or no idea of how to even begin taking the test).
Test Directions A script with more simplified language for K‐1 children explaining the test tools would be helpful.
Questar and ELPA21's IAD and ADS TMTs will be reviewing data from the Field Test and will determine appropriate next steps with regard to adjusting test directions.
Test Directions
Students would have liked a pause button to be able to pause recorded instructions and then restart the recording from the spot where it was paused. I think that currently the recording starts again from the very beginning.
ELPA21 will relay this information to states' operational test vendors.
Test Platform Recording was challenging (student logistics of speaking test).
ELPA21 strongly encourages educators to provide ample opportunities for students to practice things like the recording procedure, using the Interactive Demos, prior to testing.
Test Platform Being locked out mid-question; being kicked out of the test and having to log back on; frequent kick-outs for the same student.
ELPA21 will work with states and operational test vendors to ensure that students are not inexplicably kicked out of the test.
Test Platform
Almost all of our students were kicked off the Speaking section repeatedly. It was extremely frustrating for them and for those of us who had to type in the alphanumeric usernames and passwords multiple times.
This was an issue unique to the Field Test and was resolved in the early stages of the test. In the event that this happens in the future, ELPA21 encourages test administrators to contact the appropriate help desk to assist with these types of issues.
Test Platform
The software doesn’t let you turn the sound down until the student is all the way into the test, and it requires that the volume be adjusted for each test. Could the volume be set at a medium level initially so that we don’t damage students’ ears? This was upsetting to students.
ELPA21 will explore this issue with operational vendors.
Test Platform The audio section was a problem. I had to keep reloading to get my students to hear sound.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform On the old ELPA, the kids could flag a problem, go on, and then go back and check. The Review this time was not accurate; it seemed to miss flagged items, whether answered or not.
This is a feature of the ELPA21 assessment. ELPA21 will confirm that this is a part of the Interactive Demo.
Test Platform Loading issues; loading took too long. States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform
Some of the videos on the test didn't work, so we didn't answer the questions that were supposed to go with them, since the students didn't have the information. Also, some of the audio clips of directions would skip every now and then.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform The system wouldn't load and we would have to change computers.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform Could we know how close the students are to being done in a section (for example: question 21/50)?
States can explore this option with operational vendors.
Test Platform On a writing question, could the cursor automatically be active in the typing area instead of having to click there?
States can explore this option with operational vendors.
Test Platform For us, the site was very "buggy." Students had tests freeze, kick them out, and fail to submit even when the test was completed.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform Sometimes it would not let me log someone in if the computer had been used earlier. I had to restart the computers.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform We had a few issues with students getting kicked off in the middle of the test, or even after just logging in (not after 20 minutes like we were told).
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing. Operational vendors will need to discuss with states what the optimal amount of time is for a time‐out, both for students and administrators.
Test Platform The site occasionally was buffering as students transitioned from one unit to the next.
States should discuss bandwidth requirements with operational vendors prior to testing.
Test Platform
The tools, like note-taking, seemed to help. The note-taking tool would be better if the notes carried over to all screens that dealt with a particular reading section. Having to flip between sections and copy and paste their notes was tedious.
ELPA21 will work with operational vendors to ensure that there is information outlining the functionality of tools and that that information is accessible to both test administrators and students prior to and during testing.
Test Platform
The speaking portion was very problematic. We were able to restart students and get some to eventually work, but for some students it took up to 7 times before it would work.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform The stop button did NOT stop the recording. States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform A couple of students were confused because a question was marked as unanswered on the summary page, even though they had indeed answered it.
ELPA21 will work with operational vendors to ensure that there is information about how answers will be recorded/appear on the final review page of a student test, in the TAM.
Test Platform The review didn't accept students' responses each time; therefore, they had to return to items more times than should have been necessary.
ELPA21 will work with operational vendors to ensure that there is information in the TAM about how answers will be recorded and appear on the final review page of a student test. ELPA21 will also work with QAI specifically to ensure that their system correctly displays answered items.
Test Platform
Several tests had unresponsive “next” arrows and no “review” button, so students had only the choice of going “back” or “pause.” Two students were kicked out when they clicked on the “next” arrow even though the test was not finished.
Operational vendors need to ensure that all platform features are fully functioning.
Test Platform Several microphone buttons were frozen and did not allow students to record. One screen was missing the “continue” button needed to access the test.
Operational vendors need to ensure that all platform features are fully functioning.
Test Platform
When students (8 times) finished one subtest and tried to begin the next, the loading circle would continue on for several minutes and eventually freeze. Had to use task manager to get out. Then, the same computer would work once they signed in again.
States should discuss bandwidth requirements with operational vendors prior to testing. Test administrators should also make sure to test computers prior to testing.
Test Platform
2/26: In the listening subtest, a question had three pictures to choose an answer from. The student selected one by clicking, and there was a dark blue border around it. Then, when the student clicked the “review” button, it said he had 1 unanswered question. He clicked on it, and it brought him to the question described above. He again selected the picture; it again had a dark blue border indicating it was selected. However, when he clicked “review” once again, it brought him back to that question as “unanswered.” He repeated the cycle a couple of times; I finally just had him submit it the way it was. 3/2: The loading circle froze when a student was trying to begin a test. I pressed the ESC key, and then suddenly a question loaded and sounded like it was in the middle of presenting the question on the screen.
Operational vendors need to ensure that all platform features are fully functioning.
Test Platform
6 students in a group of 12 that had all been on the same testing schedule (i.e., had NOT paused their test for over 20 minutes) could not get in to test. They were prompted for a proctor password (this is only supposed to happen if they had paused for over 20 minutes). I tried my login password; it didn’t work. I looked all over the website for where Debbie’s email said I could find it, “at the bottom of the home page on the admin site.” Nothing. I called Questar and was told the same info; I told her there was nothing on my screen like that. Apparently it is on the testing coordinator’s screen (not helpful to a test administrator). She told me the password.
ELPA21 will work with states and their operational platform vendors to ensure that the proctor password feature is clearly explained in test administration materials.
Test Platform
3/4: At the beginning of the test session, after being prompted once again to enter the proctor password for several students who had NOT been paused for more than 20 minutes, the test reloaded questions that students reported they had already answered. Sure enough, I could see on the screen that an answer was already selected, or, if it was a speaking question, the student clicked the play button and I could hear that they had already recorded an answer. I told students to click the next button until they came to a question they had not answered yet. Sometimes I was prompted for the proctor password again for the same student after they had completed and submitted, but NOT paused, one of the subtests.
ELPA21 will work with operational vendors to ensure that administrators and students fully understand how the review screen feature is intended to work, and that the platform is functioning accordingly.
Test Platform
We experienced some difficulties with the speaking and listening tests because they would kick the students out after timing out due to an issue with the recording page. Once the students logged back in, there weren’t any issues with using the recording tool. I would recommend checking whether a script error of some sort could be causing this issue.
This was an issue identified during the early stages of the Field Test and was resolved. In preparation for the operational assessment ELPA21 will work with operational platform vendors to ensure that the recording function is working properly.
Test Platform
Many computers would proceed as if everything was fine, but instead of showing a loading bar on the bottom and then having the "begin" button appear, they would get stuck on the beginning screen and never load. I had to switch students' computers several times.
It is possible that this was a bandwidth issue. ELPA21 encourages test administrators to have IT staff on hand to assist with these types of issues.
Test Platform Students had to be reminded to touch the screen very, very softly. Otherwise, the screen would freeze.
ELPA21 is not aware of this issue. We encourage states and operational platform vendors to explore this during UAT prior to operational testing.
Test Platform
One of my students began the reading segment one day, and each day he attempted to log in thereafter, he was unable to. He would get to a screen where the "Loading" symbol ran perpetually. I believe this screen occurred after I had typed in the password for him to resume the assessment. He tried to log in on multiple days, and this happened each time. He was unable to finish the assessment because of this issue.
The root of this issue is not clear. ELPA21 will work with states and operational vendors to QA all test platforms prior to student testing.
Test Platform
I had a student doing the speaking domain and we kept getting an error code. We would go back and redo the question, and maybe 1 more, then get the error code again. Sometimes when we went back it would take us to the question the student had been on and sometimes to a question already answered.
This was an issue identified during the early stages of the Field Test and was resolved. In preparation for the operational assessment, ELPA21 will work with operational platform vendors to ensure that the recording function is working properly.
Test Platform My students had issues with submitting the test and then being told it was not complete.
The root of this issue is not clear. ELPA21 will work with states and operational vendors to QA all test platforms prior to student testing.
Test Platform
Yes.... The secured browser doesn't go into full screen mode on Mac OS X 10.6.8, even though the operating system is listed as supported. This causes issues for students when they need to access the buttons at the bottom of the screen.
This is the only instance of this issue that was reported during the Field Test. ELPA21 will work with states and operational vendors to QA all test platforms prior to student testing.
Test Platform
If students did NOT finish a section of the test and logged out, save their work and provide the capability for students to log back on and finish the section of the test they missed.
This is currently a feature of ELPA21.
Test Platform Our computers have network home sharing and it did not work with them‐ so figuring that out
The root of this issue is not clear. ELPA21 will work with states and operational vendors to QA all test platforms prior to student testing.
APPENDIX E: ELPA21 FIELD TEST PLATFORM ISSUES TRACKER
Test Platform
Technology is awful!! Kids have to go back and forth to see the entire question and/or answers. (Too much scrolling/navigation) This is more of a technology test than a language test (Scrolling, Tabbing, navigating, hidden items, too large, too much white space) Can the dialogue adjust so it can all be read as the writing/question box to the right is increased/decreased in size? Can the left/right arrow boxes be larger? Can the scrolling bars be made thicker and easier to use?
ELPA21 is working with states and operational vendors to make improvements to the test platform(s) so that scrolling is reduced, navigation is simplified and the test is generally more user friendly, especially for our youngest students.
Test Tickets
And it took some time to figure out how to get the student tickets. Also I didn't think adding students (classes) was very well detailed. We had some trouble uploading our EMIS students also.
ELPA21 will work with operational vendors to ensure that all admin procedures are clearly described in the TAM.
Test Tickets We need to be able to print a ticket for one student or a small group of new students.
States should discuss options for printing test tickets on a one‐off basis with their operational platform vendors.
Test Time Provide more specific information about how long each test should take students to complete so that we can schedule efficiently
ELPA21 will work with states and operational platform vendors to provide more detailed guidance around test time once the operational test forms have been designed.
Tools Display tools on projector or have handout key for students.
This has already been addressed.
Training Modules
To be honest, I found most of the training modules to be boring and confusing. I think they would be more effective if they were in more of a "movie" style rather than someone talking over a PowerPoint-like presentation.
ELPA21 states will work with operational platform vendors to develop updated training modules for the first operational test administration.
Translated Directions Less is more.
ELPA21 states will work with operational platform vendors to consolidate resources.
Translated Directions I was not aware of this resource.
ELPA21 states will work with operational platform vendors to ensure that all resources are accessible.
Translated Directions
We considered using the audio file, but it was clunky in terms of technology... playing the audio and running the test at the same time. Is it possible to have these directions play back from the testing interface? It seems like the administration would be more standardized if directions were automated.
This is under consideration for operational assessment but is likely cost‐prohibitive to implement.
Translated Directions
Do directions need to start without the browser up and running? It would be easier and faster if the proctor could just open the browser and then have the students begin the assessment.
ELPA21 states will work with operational platform vendors to explore this request.
Translated Directions Really didn't understand when and how to use the translations.
ELPA21 states will work with operational platform vendors to ensure that access to this resource is made apparent to all audiences.
Translated Directions The redundancies from section to section are glaring to the students. Once they know how one section functions they know how they all function.
ELPA21 will work with states and operational test vendors to review the test directions prior to the first operational administration.
Translated Directions We need translations in other languages such as Nepali and Burmese. I wish we also had it in Burmese and Karen.
If states need additional translations, it will be the responsibility of the state to work with their operational vendor to develop those translations.
Translated Directions Need translations for Interactive Demos
ELPA21 states will look into this. It may be best to use the translated directions for the operational assessment during the Interactive Demo.
Translated Directions
Honestly, with the younger students, the easiest thing was to show them how to move the mouse, where to click to make it bigger or smaller, how to scroll down, etc. I showed them once or twice and then they could do it. They haven't learned computer navigation vocabulary in Spanish, so they didn't really understand those words. I think these would be more helpful with middle school and high school newcomers who have used a computer in their native language. Most of our K‐4 newcomers have never touched a computer until they get to school.
ELPA21 plans to work with states and operational platform vendors to develop a "quick guide to technology" which can be delivered prior to exposing students to the Interactive Demo and the operational test. This will vary depending on the age and experience of the student.
Translated Directions Students are from Mexico. Speaker is not. Little children do not recognize their home language when spoken by someone with a different variety of their language.
If states need additional translations, it will be the responsibility of the state to work with their operational vendor to develop those translations.
Translated Directions
If a language isn't represented and I need a translator/interpreter for a student, will I have the option of having a trained interpreter to provide that service live?
This is an issue of state policy and will need to be discussed at the state level.
Upload
That darned initial set up was a pain in the posterior! If you didn't have everything in your format it was a long process! I almost preferred the window that popped up to give us new students to add over the upload format.
ELPA21 is working with states to develop a simplified initial setup process.
Zoom Desire for a click zoom function.
There are currently two magnification/zoom functions available. States will work with their operational vendors to develop any additional features for the operational assessment.
APPENDIX F: ELPA21 FIELD TEST ITEM ISSUES TRACKER
Item Issues and Next Steps
Category Comments/Issues Next steps
Art Among the teachers and students, would like more ethnic clothing.
This comment will be shared with our item development vendor as new items are developed.
Art BOYS Vs. GIRLS can the photos be better at showing a girl is a girl and a boy is a boy???
This comment will be shared with our item development vendor as new items are developed.
Art
For items like the bananas in the park item, can the item be set to load at full size rather than loaded at half screen with the expand arrows?
ELPA21 has adjusted the size of some artwork to limit the amount of scrolling required to view it.
Art
In all grade bands, shrink the artwork in the speaking domain test to fit the screen as long as it keeps the artwork readable; otherwise, use the top‐bottom scrolling option [and make sure the item demo site includes practice with magnifying the art work in the lesson plan; use thumbnails in the operational test]
This comment will be shared with our operational platform vendors as they come on board and prepare for the first operational administration.
Art If putting more diversity, then include more from Asia.
This comment will be shared with our item development vendor as new items are developed.
Art
The pictures for speaking are very busy and a student could be distracted; therefore, incorrectly responding to the stimulus. Is there any way to make the art and question reflect the particular prompt? <Zoom>/close up?
This comment will be shared with our item development vendor as new items are developed.
Bias
I am concerned about the references to Live Theater in test items. Many students have no background knowledge that will allow them to speak or write on this topic. There are so many alternate topics that are far more accessible.
This comment will be shared with our item development vendor as new items are developed.
Conversation Listening: For the conversation section are the words written for the students to see?
No, these items are listening items, and including the words for students to see would obscure the construct being measured.
Directions
Directions did not include information about experimental items. Re. Speaking items included in listening section, writing items included in reading, etc.
Our teams are working to develop guidelines for operational platform vendors so that when test directions are written they include information about experimental items.
Directions
It was difficult for proctors to sustain students’ attention when reading the directions. Could the tools be displayed on a projector as the proctor reads the directions? Alternately, could the directions be recorded and become part of the online test delivery? A quick online demo would be more effective than these long verbal directions. Also, would it be possible to develop a shortened version of the directions that could be read or delivered online between tests? Students don’t need to listen to the entire set of directions a second, third, or fourth time on the same day. Having the directions for the next test embedded online would allow students to advance through the tests at their own pace without waiting for the whole group to finish a test before moving on to the next. The flexibility would help make scheduling more efficient and reduce loss of instructional time.
ELPA21 recommends that vendors develop a printable/projectable tools guide that can be displayed/used during testing. Additionally, ELPA21 recommends that vendors embed a brief introductory video into the test platform with a small set of sample questions so that students can warm up and practice using the test platform features immediately before starting the test. This is especially recommended for the Speaking test.
End Test
There should be a notice when they finish the last question that says END or STOP or THANK YOU, etc. Then have the opportunity to review for any missing.
Since this functionality already exists and was enabled for the Field Test, further information would be needed to act on this feedback.
Follow Directions
On follow directions, if it's click on an area that is not a key or a distracter, will they know that no answer has registered?
Yes, students will be alerted that they have not fully answered the question when viewing the review screen. They will then be given the opportunity to review and answer any questions that were previously missed or skipped.
Font Size We need to specify font size in artwork and in charts. That is not in the font style guide.
ELPA21 has added this to the style guide.
Graph Reading Some concern about need for graph reading skills needed in Q1 of every graph set.
This comment will be shared with our item development vendor as new items are developed.
Interactive Demo
Didn't match test; needs subsections (domains) like the real test; missing some interaction types.
This comment will be shared with our operational test vendors as new Interactive Demos are developed for the operational assessment.
Interactive Demo More items to allow more practice.
This comment will be shared with our operational test vendors as new Interactive Demos are developed for the operational assessment.
Interactive Demo Missing example of expanding side by side panes.
This comment will be shared with our operational test vendors as new Interactive Demos are developed for the operational assessment.
Interactive Demo
A level too high for our limited English students "discouraged lower level students"
Since the 2015 administration was a Field Test, students received randomly assigned test forms that may not have been representative of the actual summative assessment.
Interactive Demo
Interactive tutorial at beginning of actual assessment would be helpful; quick screen cast of tools and practice opportunity
This comment will be shared with our operational test vendors as new Interactive Demos are developed for the operational assessment. Additionally, we are working to embed a brief introductory video into the test platform with a small set of sample questions so that students can warm up and practice using the test platform features immediately before starting the test.
Item Levels
Not enough items are low enough to target level 2 learners. Reading passages and vocabulary are level 3 at minimum for most items.
This comment will be shared with our item development vendor as new items are developed.
Item Levels Not enough level 5 items needed to exit students.
This comment will be shared with our item development vendor as new items are developed.
Listening Add evaluative or critique qualities to some of the questions about the debates in listening
This comment will be shared with our item development vendor as new items are developed.
Listening
• Students replayed the prompt numerous times, at least 3‐4 times
• Caused time to run out in allotted testing time
• Teachers worried that multiple listening will cause a score not reflective of what students are expected to be able to do in a classroom setting
• Teachers felt hearing the prompt twice was appropriate as students in a classroom could ask for a repeat of directions
This comment will be shared with our item development vendor as new items are developed. It is also important to note that since this test is not timed, students should not be restricted to a certain amount of time for testing. Our experts are also working to determine the number of times students will be able to replay spoken directions based on Field Test data.
Listening
Listening Sample (understory?) could be answered without listening as it could be answered from picture and questions
Visual stimuli are intended as a support for listening skills, as this is a reflection of the way that students typically engage in classroom learning (i.e. a student's oral presentation will likely have drawings/photos that accompany it)
Memory and Recall
Long student presentation passages expect extensive recall & memory skills which will affect students' ability to demonstrate linguistic level & cognitive understanding
Item Field Test stats will inform any changes to this item type, if needed, and we will work with our item development vendor as new items are developed.
Multidimensionality
Student presentation speaking section assesses listening and/or reading (in addition to speaking); consider implications of this when selecting standards and sub‐claims measured.
Student presentation speaking items were experimental, deliberately multidimensional. For scoring and standards setting, each item is tagged for the domain(s) it measures, as some items measure proficiency in more than one domain. This has all been accounted for in the item development process and will be carefully considered during scoring and standards setting.
Prompts
LA Presentation in speaking = the Reading and Listening prompts should be identical. Students with low listening &/or reading skills should be given as much opportunity as possible to successfully understand the prompts. As it is, they have to try to comprehend 2 reports that are similar, but include different structures and vocabulary.
This was deliberate, in that students do not always read their presentations word‐for‐word.
Prompts
Student clicked on object in picture instead of on the word in the sentence. Maybe there needs to be a prompt that says “click on the sentence.”
This comment will be shared with our item development vendor as new items are developed.
Reading Reading passages begin at too difficult a level to measure growth.
This comment will be shared with our item development vendor as new items are developed.
Reading Reading: Where is the phonetic awareness assessment? This comment will be shared with our item development vendor as new items are developed.
Reading
There are no reading items that measure sentence ‐ level comprehension and below. Even vocabulary questions require understanding multiple sentences.
This comment will be shared with our item development vendor as new items are developed.
Recording
Strong inclination is to (1) require students to listen to their entire recordings before moving to the next item, AND (2) have a “low recording” warning that works correctly.
ELPA21's Item Development Team will discuss this issue to make a determination.
Recording Limiting recording opportunities to 2 will not allow students to perform at their best
This was a deliberate decision in terms of the construct; ELPA21 will check FT data to see how many students gave >1 recording and change if needed.
Response Boxes
There are a number of questions that say "now write 3 responses. Write your 1st response here," and students tend to write all of their responses on the 1st response page. We will want to bring this to the attention of those doing scoring so that they do not take off points for a student providing all 3 responses on 1 page and leaving the second 2 responses blank. We may also want to reconsider the item format and show all 3 response areas on the same page (with less space, since they are short responses) so that students know up front that they will have additional spaces to provide their 2nd and 3rd responses.
ELPA21's Data Review panels will discuss this to determine if this is an issue. Readers are trained to score appropriately if it is.
Science Although we liked seeing science info text, we feel there should be more info text NOT related to science as well.
This comment will be shared with our item development vendor as new items are developed.
Scoring
On the speaking assessment, if an example is provided and the student uses the example as all or part of their answer, we need to think about how the rubric will address scoring.
This is part of the scoring rubric and training.
Speaking
For speaking, the desired expectations of responses must be given in directions or prompts to match scoring rubric expectations.
This is part of the scoring rubric and training.
Speaking
Many speaking questions ask students to describe what they do with their friends. I've had students respond "I don't have any friends", could we add a follow up prompt to say "pretend you have a friend"?
This comment will be shared with our item development vendor as new items are developed.
Speaking Picture (Speaking) ‐ Students need to receive credit even if they repeat what is mentioned in set leader.
The scoring rubric and training address this point. It is handled according to the prompt requirements.
Speaking
Since speaking is being recorded and not done in person, when a student answering a question such as "Where are the boys?' with "the boys" are right there and pointing to them, how will the grader know that this is a correct answer?
They don’t. If the student does not speak the response, the graders cannot score it.
Speaking
Speaking test should include some very easy responses to easy prompts such as questions ‐‐What is the weather? ‐‐Where do you live?
This comment will be shared with our item development vendor as new items are developed.
Speaking
The fact that recordings were cut off after 59 seconds caused students to limit their responses. As a result, the speaking items seemed to be measuring reading and listening comprehension more than they were measuring speaking.
This is being investigated. In the operational test, there will not be a 60‐second time limit.
Speaking students need a reminder to use complete sentences in Speaking
This comment will be shared with our item development vendor as new items are developed.
Speech Speed
To what extent is the speed of the person talking a part of the listening construct? Has speed of speech been attended to in the development of listening items? Did the content review committees comment on this?
Speed is definitely part of the construct, at least potentially. Our approach on ELPA21 has been to record at a natural pace and intonation, with a bit of a bias towards making it maximally comprehensible. The content review committees did not comment on pace in particular, but they did hear the recorded audio samples and did not raise any concerns about speed.
Spelling The spelling in word builder does not match the ELP Standards. Is spelling appropriate based on this?
The task type is not meant to assess spelling and so invented or phonetic spelling may be scored as correct.
Standards
Standard 2, levels 1 and 2 ‐ maybe add something about text complexity, as that could be the difference between levels.
The next round of standards review/revision will address this.
Student Debate
Will overall directions for listening student debate section prepare students for the structure of the information presented, such as telling them they will be given the main topic of the debate, then they will be told what information to be listening for?
The introduction gives students the topic of the debate and tells them what to listen for.
Tablets
On tablets, the term “click” doesn’t really describe what students need to do. Is there a word other than “click” that we plan to use? In most cases we went with “choose” rather than “click” because that wording works regardless of platform. There are a few task types (e.g., Kindergarten follow instructions) where “choose” seemed too vague/not to work, so we left it as “click” which seemed to communicate better.
Our teams are investigating whether an alternate term can/should be used.
Test Design
Research in test design (psychometrics) shows that using only 2 distractors (i.e., having only 3 multiple‐choice options) gives test takers a much higher probability of guessing and getting the right answer. Research shows that adding another distractor (3 total, or 4 options) will greatly add to the reliability of the test. Please add 3 options, instead of having two options, for grades K‐2.
The research and best practice vary on this topic, but fewer options do mean a higher chance of guessing. The item calibration procedure takes the probability of getting the answer correct by guessing into account. For young children, fewer options are often used in multiple choice items. Field Test analyses will add to our understanding of whether additional options should be added in future item development.
Test Design
Many items have the same art and/or words or phrases. It is important that the next vendor assembles the test carefully to ensure students don't get similar items. Example with same art: Reading: fast car; Writing: the car is fast.
This issue was limited to the way items were presented in the Field Test forms and will not be encountered operationally.
Type ‐ Book Report Book report scenarios ‐‐ need enough time to read vs. listen, since they aren't identical.
Our teams will be revisiting this item type to address this issue.
Type ‐ Listen for Info
Listening for information ‐ students need the option to hear the answer choices or it is assessing reading standards as well.
While the task does require some reading, this is minimized by the use of easy words and by the fact that the students have heard the word in the presentation. Content review committees approved. Item stats and analyses will tell us more.
Type ‐ Short Correspondence
Short correspondence ‐ I would like to see answers to questions either accompanied by picture or only have picture
This comment will be shared with our item development vendor as new items are developed.
Type ‐ Tableau Rubric not appropriate to capture level of proficiency. *Need more scoring levels.
While the task does require some reading, this is minimized by the use of easy words and by the fact that the students have heard the word in the presentation. Content review committees approved. Item stats and analyses will tell us more.
Vocabulary
P 32. speaking directions/narration: Use "classmate" rather than "partner" due to multi‐meaning context for more common, generalized term.
This comment will be shared with our item development vendor as new items are developed.
Vocabulary P. 39 refer to Bias Committees: sofa/couch.
This comment will be shared with our item development vendor as new items are developed.
Word Bank
Word bank items ‐ if the student cannot read, they cannot use the word bank. It’s going to be really hard to score these – the student said “I see a girl grabbing a picture” but wrote “I se grol grafon a pechur.” It looks like gibberish but works phonetically. Automated text reader scoring will be very tough for these.
Readers are trained to read through phonetic spellings and score appropriately. The word bank is read aloud to students as well as printed.
Word Builder
Word builder‐‐‐Word. Maybe you could say the sounds of each letter in the word and the child can drag the missing sound. Say whole word‐‐‐say sounds slowly.
This comment will be shared with our item development vendor as new items are developed.
Writing
The writing test as written is more of a listening test/spelling test than a writing test. There needs to be actual writing with paper and pencil.
A pencil and paper subtest was added to address Standards 3 and 10 at grades K and 1.
Writing
To assess primary writing the narrator should ask a question that the student can answer by arranging words from a word bank (word bank words should be interactive with sound)
This comment will be shared with our item development vendor as new items are developed.
Item Issues and Next Steps: Kindergarten
Category Comments/Issues Next steps
Foundational Skills Reading/Writing foundational skills are missing from the K level PLDs.
The PLDs are from the ELP Standards.
Item Type
"all my students got confused about was when it asked to restate a question....my students ended up answering the question. For example, the test would say, “Ask the question where did bill get those rocks." Instead of repeating the question, my students would answer it"
Field Test data will confirm how often this occurred; we may want to add this to the tutorial.
Reading Narration change to K‐R‐Read and match word? "Choose the picture that best matches the word."
"This comment will be shared with our item development vendor as new items are developed. "
Specs ‐ Reading
Changing specs ‐ Reading at the word level should be oral reading of the word then the student clicks on one of 3 options that are written words. Test K readiness, reading skills.
This comment will be shared with our item development vendor as new items are developed.
Standards
K word builder: Standard 3 only applies at level 5 and Standard 10 applies at level 1. This is a huge disconnect. I don't think there is enough support for ELs at levels 1‐3 on Standard 10.
This comment will be shared with our item development vendor as new items are developed.
Writing List letter choices in K writing in alphabetical order.
This comment will be shared with our item development vendor as new items are developed.
Item Issues and Next Steps: Grade 1
Category Comments/Issues Next steps
Item Type
Sentence builders too hard for 1st grade! Please start with a simple 3 word sentence and move to more difficult sentences.
This comment will be shared with our item development vendor as new items are developed.
Names
First graders get hung up on names when reading… Anya, Gwyn, Chun‐y, Rico, Zena, Juanna, Camila, Paola. Names should be simpler & able to sound out.
This comment will be shared with our item development vendor as new items are developed.
Prompt
A 1st grade student had some uncertainty about the select the word in the sentence question type. I think that the first time, the student missed the audio prompt and there is nothing on the screen that indicates what needs to be done. Perhaps there is a way that we could make it more obvious that a word needs to be selected.
This will be shared with our operational platform vendors prior to the first operational administration.
Type Face Double spaces are needed after each period. This will help first graders read.
This comment will be shared with our item development vendor as new items are developed.
Writing 1st grade writing section needs to be re‐thought, none of these are assessing writing…Bummer…
Students in K and 1 will also receive a paper and pencil form for a portion of the writing test, to ensure that skills like letter generation and copying are accurately assessed.
Writing I do not like writing being computer tested.
Students in K and 1 will also receive a paper and pencil form for a portion of the writing test, to ensure that skills like letter generation and copying are accurately assessed.
Item Issues and Next Steps: Grade Band 2‐3
Category Comments/Issues Next steps
Keyboarding
Second and third graders don’t know how to type a paragraph. The teacher completed the demonstration lesson, but students didn’t remember how to get the cursor in the box, how to make a capital letter, how to put a space between words, etc. Older students who had not been in school in the US for very long were also lacking in keyboarding skills.
Whether students at this level have the necessary skills will be one of the factors ELPA21 reviews during data analysis.
Listening Is the IAD TMT sure they want auto‐play for the listening stimulus?
We will review comments and data from the Field Test to determine if a change is needed.
Standards (error)
Standard 8, level 2 ‐‐> cut and paste error. The bullet should read: Determine the meaning of frequently occurring words, phrases, and expressions.
We will evaluate this issue during the next round of ELP standards review.
Keyboarding
students didn't persist due to difficulty keyboarding (2nd graders)
We will review comments and data from Field Test to determine if a change is needed.
Item Issues and Next Steps: Grade Band 4‐5
Category Comments/Issues Next steps
Test Difficulty
with the other test (Iowa), ELL could stop the test if they were newcomers…they had to continue even though they were frustrated and 'felt stupid'
ELPA21 is hoping to move towards an adaptive algorithm in the coming years.
Specific Item
Stay away from random/abstract photo prompts (i.e. the mixed‐up market) It did not address compare/contrast vocabulary. Better comparison would be to have an outdoor market and a supermarket
This comment will be shared with our item development vendor as new items are developed.
Item Issues and Next Steps: Grade Band 6‐8
Category Comments/Issues Next steps
Student Engagement
some reading passages are too long and will not hold students' attention
This comment will be shared with our item development vendor as new items are developed.
APPENDIX G: ELPA21 PLATFORM AND SYSTEM TRIAL SURVEY QUESTIONS
ELPA21 Platform and System Trial Survey Questions
1. Test Administration Manual. For all of the following, please indicate your level of agreement with the
following statements from strongly agree (4) to strongly disagree (1).
a. The Test Administration Manual is written in a manner that Test Administrators will easily
understand the steps for test administration.
b. The log in process for the Interactive Demo was easy to follow.
c. The Testing Directions in the Test Administration Manual were well organized.
d. The Student Test log in process for the test was easy to follow.
e. The Troubleshooting Tips in the Test Administration Manual include enough information to aid
Test Administrators on the day of the test.
f. If you disagree, please provide specific details to support your rating.
2. Test Coordinators Manual. For all of the following, please indicate your level of agreement with the
following statements from strongly agree (4) to strongly disagree (1).
a. The instructions to set up and configure Questar’s Assessment System in the Questar Setup and
Installation Guide were clear and easily followed.
b. The Home Page of the Admin website provided enough information about my school’s online
administrations to allow me to prepare for the administration.
c. The process for printing student Login Tickets was successful and the instructions for adding or
editing Examiners and Students are clear.
d. Please provide your feedback on the resources available in the Additional Resources page of the
Test Coordinator’s Manual.
3. Accessibility and Accommodations Manual. For all of the following, please indicate your level of
agreement with the following statements from strongly agree (4) to strongly disagree (1).
a. The Accessibility and Accommodations Manual provides sufficient details for district personnel
to prepare for and implement ELPA21 assessments.
b. District personnel will have a thorough understanding of Embedded and Non‐embedded
Universal Features after reviewing the Accessibility and Accommodations Manual.
c. District personnel will have a thorough understanding of Embedded and Non‐embedded
Designated Features after reviewing the Accessibility and Accommodations Manual.
d. District personnel will have a thorough understanding of Embedded and Non‐embedded
Accommodations after reviewing the Accessibility and Accommodations Manual.
e. Appendix A is a helpful summary of the content in the Accessibility and Accommodations
Manual.
f. If you disagree, please provide specific details to support your rating.
4. Perspectives. For all of the following, please indicate your level of agreement with the following
statements from strongly agree (4) to strongly disagree (1).
a. My students would easily understand the steps read to them to log into the test.
b. My students would easily understand the directions to use the tools in the test.
c. I experienced no technical problems during the testing process.
d. If applicable, please describe your experience when you called the Customer Support line.
e. If you disagree, please provide specific details to support your rating.
5. What type of operating system (OS) did you use for the trial?
6. Please indicate your level of satisfaction with the quality and thoroughness of the modules from very
satisfied (4) to very dissatisfied (1).
a. Platform and Test Overview
b. Trial Orientation
c. Testing Lab Management
d. Workstation Preparation
e. Student Testing Sessions
f. Accessibility and Accommodations
g. Troubleshooting
h. Student Testing Experience
i. If you were not satisfied, please provide specific details to support your rating.
7. Do you have general feedback or comments regarding your experience with the Questar Assessment
System that you would like to share with ELPA21 and Questar Assessment, Inc.?
8. If you consent to having your comments used for ELPA21 communications, please click “yes” and
provide your first and last name and your email address.
APPENDIX H: ELPA21 FIELD TEST SURVEY QUESTIONS
ELPA21 Field Test Survey Questions
1. Please select your state:
□ Arkansas
□ Kansas
□ Iowa
□ Ohio
□ Louisiana
□ Nebraska
□ Oregon
□ Washington
□ West Virginia
□ Other (free space to type response)
2. Please indicate which grade(s) you worked with during the ELPA21 Field Test. (select all that apply)
□ Kindergarten
□ 1st Grade
□ Grades 2‐3
□ Grades 4‐5
□ Grades 6‐8
□ Grades 9‐12
3. What student‐proctor ratio did you find most conducive to testing?
4. Do you have any comments regarding the student‐proctor ratio?
Training materials
Please indicate which resources you used in preparing for the Field Test:
5. Pre‐ID templates and instructions [y/n, if yes (see below)]
a. The Pre‐ID Sample Template, Pre‐ID File Format and Pre‐ID Upload Quick Reference Guide were
helpful in preparing my school’s Pre‐ID File. [strongly agree, agree, disagree, strongly disagree,
comment]
6. The Interactive Demos [y/n, if yes (see below)]
a. I was able to access the Interactive Demos modules prior to testing. [y/n, comment]
i. (If yes) The Interactive Demos were a valuable tool for my students to practice with.
[strongly agree, agree, disagree, strongly disagree, comment]
ii. (If yes) Approximately _______ percent of my students practiced with the ELPA21
Interactive Demos before taking the Field Test. [insert percentage]
7. Interactive Demo Lesson Plans [y/n, if yes (see below)]
a. (If yes) The lesson plans that accompanied the Interactive Demos were helpful in preparing
my students for the Field Test [strongly agree, agree, disagree, strongly disagree, comment]
8. Test Coordinators Manual [y/n, if yes (see below)]
a. The Test Coordinators Manual is written in a manner that provided valuable information
and was easy to follow. [strongly agree, agree, disagree, strongly disagree, comment]
9. Test Administrators Manual [y/n, if yes (see below)]
a. The Test Administrators Manual is written in a manner that provided valuable information
and was easy to follow. [strongly agree, agree, disagree, strongly disagree, comment]
10. Accessibility and Accommodations Manual [y/n, if yes (see below)]
a. The Accessibility and Accommodations Manual is written in a manner that provided valuable
information and was easy to follow. [strongly agree, agree, disagree, strongly disagree,
comment]
b. District personnel will have a thorough understanding of Embedded and Non‐embedded
Universal Features after reviewing the Accessibility and Accommodations Manual. [strongly
agree, agree, disagree, strongly disagree, comment]
c. District personnel will have a thorough understanding of Embedded and Non‐embedded
Designated Features after reviewing the Accessibility and Accommodations Manual.
[strongly agree, agree, disagree, strongly disagree, comment]
d. District personnel will have a thorough understanding of Non‐embedded Accommodations
after reviewing the Accessibility and Accommodations Manual. [strongly agree, agree,
disagree, strongly disagree, comment]
e. Are there other features that are not currently supported that your students would need for
the operational assessment? [y/n, comment]
f. Did the universal features available during the test meet students’ needs? [open response]
g. Did the designated features available during the test meet students’ needs? [open response]
h. Did the accommodations available during the test meet students’ needs? [open response]
i. Did any students seem disadvantaged by the test? [open response]
11. The Setup and Installation Guide [y/n, if yes (see below)]
a. The Setup and Installation Guide is written in a manner that provided valuable information
and was easy to follow. [strongly agree, agree, disagree, strongly disagree, comment]
12. The Training Modules
a. Field Test Overview [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
b. Trial Orientation [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
c. Student Testing Experience [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
d. Student Testing Sessions [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
e. Testing Lab Management [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
f. Accessibility and Accommodations Tools [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
g. Workstation Preparation [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
h. Troubleshooting [y/n, if yes (see below)]
i. This training module provided valuable information and was easy to follow.
[strongly agree, agree, disagree, strongly disagree, comment]
13. Directions for Administration Scripts and Audio Files (11 native language translations available)
[y/n, if yes (see below)]
a. If yes, which native language translation did you use with your students? [open response]
i. Did you use the scripts or the audio files? [scripts, audio files, both]
ii. Was this an effective tool for your students? [y/n, comment]
14. Were you able to access and maneuver within the ELPA21 Field Test Administration Site
successfully? [Yes – easily, Yes – with some difficulty, Yes – with much difficulty, No, comments]
Test items
15. Based on what you observed in the Interactive Demos, how do you feel ELPA21’s test items
compare to your old ELP assessment?
a. Measurement of English Language Proficiency: [more accurate, about the same, less
accurate, n/a, comment]
b. Difficulty: [harder, about the same, easier, n/a, comment]
c. Engagement: [more engaging, about the same, less engaging, n/a, comment]
d. Technology: [enhances the quality of the test questions, about the same, distracts from the
quality of the test questions, n/a, comment]
Customer support
16. Did you contact the Questar Help Desk at any point before or during the Field Test? [yes, no,
comment]
17. If applicable, please describe your experience when you called the Customer Support line.
[comment]
Communications
18. The emails and announcements leading up to the Field Test provided me with timely and adequate
information to prepare for the Field Test. [strongly agree, agree, disagree, strongly disagree,
comment]
19. During the Field Test, I visited elpa21.org to learn about the test and to find guidance and/or resources.
[y/n, if yes see below]
20. Please mark any of the following materials you or your school used during the Field Test:
a. ELPA21 State/District/School Test Administrators' Quick Start Checklists [checkbox]
b. Field Test Administration and Technical Manuals overview [checkbox]
c. Field Test and Platform and System Trial Training Modules Overview [checkbox]
d. ELPA21 Field Test FAQ [checkbox]
e. ELPA21 calendar of events [checkbox]
f. ELPA21 Hardware Specifications [checkbox]
g. ELPA21 Headset Specifications [checkbox]
h. Additional Headset Information [checkbox]
i. ELPA21 Headset Kits: Tips and Tricks [checkbox]
j. ELPA21 Field Test v. Platform and System Trial Chart [checkbox]
21. During the Field Test, I visited my SEA’s website for resources and guidance on the Field Test. [y/n,
comment]
22. Do you have any suggestions or ideas on how ELPA21 can better share resources and
announcements to schools?
Role Specific Questions
Please indicate your role during the ELPA21 Field Test. (select all that apply)
1. District Test Coordinator
2. School Test Coordinator
3. Test Administrator
The questions that follow will appear based on the role selected above.
Test Administrator
Student experience
1. My class experienced ________ technical problems during the testing process. [no, limited, some, many,
comment]
2. Students understood the test directions and were able to successfully navigate the platform. [strongly
agree, agree, disagree, strongly disagree, comment]
3. Did it appear that students could find and navigate the tools easily? [y/n/comment]
4. What percent of students were able to complete the test within the estimated timeframe? ________
[insert percent]
Technology preparedness/computer‐based delivery platform
1. Based on your observations during testing, was there a particular issue that students appeared to
struggle with? [y/n; if yes, what, comment]
2. Of the students you observed, about how many seemed to have significant difficulty with the computer‐
based test delivery? ________ [insert percent]
Accessibility and Accommodations
1. Were accessibility features of the test sufficient for students with disabilities? [y/n/comment]
2. Did any one group of students seem to be disadvantaged by the test? [y/n/comment]
3. Do you have any suggestions for ELPA21?
District/School Test Coordinator
Test Registration System and Administration Site
1. Did you feel prepared and able to support test administrators during testing? [very prepared, fairly
prepared, not prepared, comment]
2. Does teacher feedback indicate that all administration processes were followed?
3. To your knowledge, what percent of students completed the practice test prior to taking the Field Test?
________
4. Do you have any suggestions for ELPA21?
Consent to share survey feedback publicly
ELPA21 will be compiling the feedback received via this survey to better understand how we can improve our
assessment system for the future. In an effort to provide the public with an accurate picture of how the ELPA21
system is working in the field, we would like your permission to include any of the above responses in ELPA21’s
public‐facing materials (our website, newsletter, etc.). Only your state and role (district test coordinator, school
test coordinator, test administrator) would be indicated, and all names will be redacted if comments are chosen
to be used by ELPA21 in any way.
□ I consent to having my comments used for ELPA21 communications.