Before Administering the CLAS-E Writing Assessment
Required Staff Training
Certification and Annual Recertification
Required Test Materials
When to use the BEST Plus, when to use the TABE CLAS-E Writing Assessment
Student Placement and When to Administer
Student Primary Assessment Area May Change in Fiscal Year
Programs Enrolling Students after April 1 May Receive Credit for Post Tests and Learning Gains, under Certain Conditions
Test Accommodations
Pre- and Post-Testing Interval Varies by Intensity of Class Hours
When to Alternate CLAS-E Test Levels and Forms
Co-enrolled Learners
Test Security
During CLAS-E Administration
Who May Administer
Use of the CLAS-E Locator
Use the Statewide SABES-developed CLAS-E Locator Answer Booklet and SABES-developed Writing Assessment Answer Sheets Only
Following Test Administration Procedures and the Time Limits of the Test
Using Assessment Reports for Tracking When to Post-Test
Test Conditions
After Administering the CLAS-E Writing Assessment
Scoring the CLAS-E Expository Writing Tests
Programs are Strongly Encouraged to use the Guidelines for Retesting with TABE CLAS-E Writing Assessment
TABE CLAS-E Writing Assessment Retesting Guidelines Chart
Recalibrate each Time you Score
CLAS-E Scoring Monitor
After Scoring Tests, Track Scoring Consistency
Score Reporting
Exit Criteria for NRS Advanced ESL Level Students
TABE CLAS-E Writing Scale Scores for NRS Educational Functioning Levels Chart
TABE CLAS-E Writing Scores’ Correlation to NRS (National Reporting System) EFL (Educational Functioning Levels) and SPL (Student Performance Levels) Chart
Measuring Learner Gains
Copying over CLAS-E Scores to the Next Fiscal Year
The Massachusetts Department of Elementary and Secondary Education (ESE) is mandated by the US Department of Education to use valid and reliable assessments to report students’ completion of educational functioning levels. ACLS requires programs to use four ABE standardized assessments, depending on the classes they offer: the Massachusetts Adult Proficiency Test (MAPT), the TABE Forms 9/10, the BEST Plus, and the TABE CLAS-E.
Massachusetts’s process for using required assessments to measure learning gains (gains are based on the first test and the last test given in a fiscal year):
Class Placement (after intake, using any assessment other than MAPT)
Pre-test (within 2-4 weeks of class placement)
Formative assessment using teacher-made or other assessments during class to determine if students mastered what was taught (ongoing)
Optional Mid-year Test (after 65 hours of instruction; test no more than three times per year)
Post-test (after 65 hours of instruction, and before June 30; programs may not test more than three times per year)
All assessments administered must be entered in SMARTT.
Which Programs Use
TABE 9/10, Levels E, M, D, A: assesses ABE Reading and Math and ABE Writing (Language subtest). Used by ABE programs in correctional institutions and some workplace programs without access to computers.
For assessments to be accurate, they must be administered and scored according to the test developers’ instructions. If staff make what may seem like small changes to test administration or scoring (such as giving test directions differently or diverging from a test’s rubric when scoring), test scores lose their accuracy. These changes obscure the instruction learners may need and slow their progress toward their goals. They also adversely affect the accuracy of the statewide scores on which ACLS bases its annual state projections of learning gains for the US DOE.
Formative assessments are also crucial to instruction and learner gains. They include authentic, teacher-made, task-based activities and products, etc. They test various skills determined by the teacher’s learning objectives and should include content and skills from the Massachusetts ABE Curriculum Framework for ESOL and the College and Career Readiness Standards for Adult Education (CCRSAE). Formative assessments are important because they indicate what students learned (and did not learn) and they guide what to teach next. They also involve students in the learning process, which improves motivation, persistence, and retention.
Overview of the TABE CLAS-E Writing Assessment
The state’s assessment policy requires programs to use the TABE CLAS-E Writing Assessment for the following adult learners:
ESOL students (SPL 0-6) whose goal is to improve their writing skills
Students whose primary instruction is provided by the volunteer program English at Large
The state’s assessment policy requires programs to use the BEST Plus test (computer-adaptive version) for the following learners enrolled in ESOL classes:
ESOL students (SPL 0-6) whose goal is to improve their oral proficiency.
The state’s assessment policy requires programs to use the TABE CLAS-E Reading Assessment for the following learners:
ESOL students (SPL 0-6) whose goal is to improve their reading skills
The following program types may use either the TABE CLAS-E Writing or Reading Assessment or the BEST Plus, depending on their learners’ goals:
Distance Learning (ESOL) Programs
Students enrolled in Pre-Literacy ESOL classes, Levels 1, 2 and 3
Before Administering the CLAS-E Writing Assessment
Required Staff Training
A minimum of two staff per program must be trained to administer and score the CLAS-E Writing Assessment before any testing of students can begin. Programs offering ESOL classes must maintain at least two CLAS-E test administrators at all times.
Trained staff may not train fellow staff members at their programs.
The training has been separated into two sessions: one for practitioners administering the CLAS-E, and one for those scoring it.
Practitioners administering and scoring the test (or only scoring it) must take both the Administering and the Scoring portions of the training and be recertified annually (see below).
Certification and Annual Recertification
Following their training in test administration and scoring, participants must successfully score a series of expository writing samples before they receive Competency status and are approved to score. Please note: no certificates will be given; program staff must retain their email notification. Practitioners receiving their Initial Certification in FY15 will not have to recertify until FY16. If trainees do not pass the initial certification, Joan Ford, SABES director of assessment professional development, will contact them to arrange for remediation. There is a similar process for annual recertification. Each year, practitioners who are currently certified to administer and score the CLAS-E Writing Assessment will be notified in early February by ACLS that the recertification process will begin soon.
At the end of February, Joan Ford will email essay packets for each CLAS-E scorer to the scorer’s program Director. Program Directors must distribute the packets to their scorers as soon as they are received. CLAS-E Writing test administrators/scorers will have one month to score a set of sample writing assessments and mail or fax them to Joan Ford by the deadline identified in ACLS correspondence, usually by the end of March. Joan will inform test administrators of their status one month after the deadline for submission. If trainees do not pass, they will be contacted by Joan to receive remediation. The competency status will be valid for at least one year. Please note: no certificates will be given; program staff must retain their email notification.
Required Test Materials
The following Test materials may be purchased at http://www.ctb.com/ctb.com/control/childNodesViewAction?categoryId=1145&adjBrd=Y
CLAS-E Locator Test
CLAS-E Locator test Directions
CLAS-E Replacement Test Books, Forms A and B, Levels 1 through 4
CLAS-E Expository Writing Folios, Forms A and B, Levels 1 through 4
CLAS-E Test Directions for Forms A and B, Levels 1 through 4
CLAS-E Scoring Tables Book, Forms A/B
CLAS-E Writing Scoring Guide, Forms A/B
The following required test materials may be acquired from the CLAS-E Writing training or from Joan Ford, SABES Director of Assessment, at firstname.lastname@example.org:
Tips for Taking the TABE CLAS-E (for test-takers) and test administrator directions
Adapted CLAS-E Locator Test Answer Booklet for Statewide Use; use this Answer Sheet instead of the CTB McGraw-Hill CLAS-E Locator Test Answer Booklet
Adapted TABE CLAS-E Locator Test Directions
SABES-developed Answer Sheet for Levels 1-4
SABES-developed alternative directions for Answer Sheet for Levels 1-4
Supplementary Scoring Guide for CLAS-E Expository Writing
Reformatted “Notes to the Scorer” for use with CLAS-E Expository Writing
When to use the BEST Plus, when to use the TABE CLAS-E Writing Assessment
Student SPL Level
Assessment to Use
If an ESOL student is placed at SPL 0 or above:
Use the BEST Plus or the TABE CLAS-E Reading or Writing Assessment
If an ESOL student’s goal is to improve his/her speaking and listening skills:
Use the BEST Plus
If an ESOL student’s goal is to improve his/her writing skills:
Use the TABE CLAS-E Writing Assessment
If an ESOL student’s goal is to improve his/her reading skills:
Use the TABE CLAS-E Reading Assessment
Student Placement and When to Administer
Programs may use any placement tests they choose. The CLAS-E Writing Assessment may be given at intake and may serve as both the placement test and the pre-test. It may also be administered in class any time within 2-4 weeks of class placement, but no later than 4 weeks.
Student Primary Assessment Area May Change in Fiscal Year
Either the CLAS-E Writing, CLAS-E Reading, or BEST Plus must be selected as the primary assessment to report educational gain for students in ESOL classes. Programs are required to administer only one of the three, but more than one assessment may be administered, with one test counting as the primary assessment area and entered into SMARTT.
A student’s Primary Assessment Area may change during the fiscal year, and assistance from SABES SMARTT Tech Support is no longer needed.
Learners who are co-enrolled may have different primary assessment areas at the different programs where they are enrolled to maximize their time at each program. (See the section on co-enrolled learners for more information).
Programs Enrolling Students after April 1 May Receive Credit for Post Tests and Learning Gains, under Certain Conditions
Students who are enrolled in a program on or after April 1 and who have 65 hours of attended instruction between April 1 and June 30 may take a pre- and post-test. The learner’s post-test will be added to the program’s percentage of pre-/post-tested learners. Any type of attended hours qualifies, including rate-based class hours, non-rate-based class hours, match hours, and distance learning hours. In addition, if students have attended 65 hours of instruction between April 1 and June 30 and make learning gains, these gains will be added to the program’s percentage of learner gains.
Test Accommodations
An adult learner with a disability must provide the ABE program with disability-related documentation if he/she requires accommodations. If a counselor or other program staff person determines through a screening that there is a strong possibility the learner has a learning disability or other disability, a formal assessment may be undertaken. A formal assessment of a learning disability must be administered by a licensed professional (e.g., psychologist, school psychologist, or psychiatrist) and is valid for 5 years from the date of the formal assessment. In some cases, a “licensed professional” may be a speech, vocational, physical, or occupational therapist with verification by a licensed medical doctor, psychiatrist, or psychologist. An IEP (Individual Education Plan) is not a documented formal diagnosis of a learning disability.
“Disability-related documentation” includes educational assessments, or an Individual Education Plan (IEP) developed by the public school system to document a person’s disability for an accommodation in the ABE program. For more information about modifications and accommodations relating to ABE instruction, please refer to pages 13-15 in the ACLS Disability Guidelines, at http://www.doe.mass.edu/acls/disability/default.html.
Any student may be given the large print version of the TABE CLAS-E Writing if needed. Contact Joan Ford at email@example.com for more information or to borrow it.
For any questions, concerns, or if staff have a student who should receive accommodations on the CLAS-E Writing Assessment, please contact April Zenisky-Laguilles at firstname.lastname@example.org.
Pre- and Post-Testing Interval Varies by Intensity of Class Hours
Programs need to pre- and post-test enrolled students each fiscal year in order to capture student educational gain. The general recommendation is to post-test after an interval of 65 hours of instruction. Programs may not test learners more than three times in a fiscal year. The test administered for the pre-test must remain the same for the post-test (e.g., one may not pre-test with the CLAS-E Writing and post-test with the CLAS-E Reading or the BEST Plus).
When to Alternate CLAS-E Test Levels and Forms
The CLAS-E Interview Part B and the Locator Test Part 1 must be used for the first CLAS-E Writing test given to learners. The Locator does not need to be given on subsequent tests.
Different levels of the CLAS-E may be given for initial, optional mid-year, and post-testing. For example, a learner may be pre-tested at level 2, and post-tested at level 3. All levels of CLAS-E are calibrated on the same scale, so results may be compared across levels.
When testing students at mid-year (optional) and/or at the end of year (post-test), alternate test forms so that no student takes the same form twice in a row; learners could easily remember the form from one test administration to the next.
If a student is functioning at the same level, switch forms (e.g., Level 3, Form A to Level 3, Form B).
If a student has shown strong progress in class, move to the test at the next level with the same form (e.g., Level 3 to 4, Form A).
It is permissible to pre-test with Form A, administer a mid-year optional test with Form B, and post-test with Form A within a fiscal year.
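For illustration only, the form-and-level choice described above can be sketched in code. The helper `next_test` is hypothetical (not part of SMARTT or the CLAS-E materials) and simplifies the policy to two cases: alternate forms at the same level, or move up a level with the same form.

```python
def next_test(prev_level, prev_form, strong_progress):
    """Pick the next CLAS-E Writing level and form, following the
    alternation guidance above (a simplified, unofficial sketch)."""
    if strong_progress:
        # Strong progress in class: move up a level, keep the form
        # (e.g., Level 3 Form A -> Level 4 Form A).
        return prev_level + 1, prev_form
    # Same functioning level: switch forms
    # (e.g., Level 3 Form A -> Level 3 Form B).
    return prev_level, "B" if prev_form == "A" else "A"
```

As the policy notes, a form may repeat later in the year (A, B, A) as long as no student receives the same form twice in a row.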
Co-enrolled Learners
Assessment information for learners who are co-enrolled or transferred will appear in SMARTT so that all programs involved with those students can use it.
A student’s required assessment (e.g., CLAS-E Writing or Reading, BEST Plus, MAPT, TABE 9/10) will appear in SMARTT at all sites where the student is enrolled, regardless of the site where the student took the test. This feature makes it easier for programs to enroll these students and get them settled in class.
Assessments are color-coded in the SMARTT Student Assessment Screen to show where the student took the test.
The Primary Assessment Area may be set at all sites within the current fiscal year, and may vary between sites. (For example, a student may have a primary assessment area of math at one site, and reading at a site at which he/she is co-enrolled).
The copy icon will appear next to all May/June tests so that users may copy the last test taken from any site to their own site in the new Fiscal Year.
A link labeled “Other Site Assessments” located on the SMARTT “Welcome Screen” shows the list of co-enrolled students who took tests at other sites. There are two panels for current and exited students. Programs may view the list and drill down to the assessment screen to view tests for individual students. Programs may decide if they wish to set the student’s primary areas for this test. A column also shows if the primary area has been set. This feature allows programs to determine whether or not dually enrolled students need to have another assessment administered immediately.
To access the SMARTT Assessment Reports, click on the "Site" link on the left menu in SMARTT, and then click on the "Assessment Report Primary Set" option. The report will list all the tests, dates, forms, and next suggested testing date. Please note that the approximate next test date is based on four months, but you may test sooner in order to get a final post-test date for the year; the minimum is two months.
Cognos report of All Co-enrolled Students at a Site
In addition to the SMARTT Student Assessment Screen, there is also a new report in Cognos that allows a program to view all co-enrolled students in a site, sorted by their assessment (e.g., CLAS-E, BEST Plus). This report will list the co-enrolled student’s name, other site(s) in which the student is co-enrolled, the site at which the test was taken, the date taken, the test name, test form and level, and the test score. This report can be helpful in that the SMARTT Student Assessment Screen lists students individually, while this Cognos report lists all co-enrolled students in a program, filtered by their assessment. To use this report, log on to Cognos at http://www.doe.mass.edu/acls/smartt/ using the program’s or an individual Cognos account. Once on the ACLS Homepage, select the tab at the top labeled “Desk Review” and the report on co-enrolled students will be among the reports listed.
To receive a Cognos Account
Program staff may use their program’s Cognos account, since every site has one. Staff wishing to have an individual Cognos account may have their program director request one, as individual access is at the discretion of the Program Director. Requests for accounts need to be made two weeks prior to when the account is needed. Program directors, please send requests to Sriram Rajan email@example.com with the following information supplied:
Name of staff who will receive the account
Official Site Name (no acronyms or abbreviations)
Role at Program (specify Teacher, Counselor, Site Coordinator, or other)
The requested Cognos User ID (e.g., first initial and last name, or some other personally meaningful identifier)
Users will receive their Cognos account information by email, along with a generic password. When users first log in, they must create their own password. Users need to write down the user ID and password and keep them in a handy place. Users and Programs must manage their own passwords and User IDs; ACLS will not have that information.
Test Security
All CLAS-E Writing test materials and student test scores must be kept in a secure place. Do not file students’ CLAS-E Writing tests in their personal portfolios. Staff should, however, share, explain, and discuss scores with students so students can understand and follow their progress.
Staff and test administrators may not use CLAS-E Test Booklets or test items to prepare learners for their CLAS-E tests. The Massachusetts Department of Elementary and Secondary Education reserves the right to immediately terminate the program’s grant if any staff are found to be violating the CLAS-E assessment policy regarding test security.
During CLAS-E Administration
Who May Administer
Only certified TABE CLAS-E Writing test administrators may administer the assessment. Certified test administrators may not score their own students’ expository writing tests.
Use of the CLAS-E Locator
Learners taking the CLAS-E Writing test for the first time are required to take the CLAS-E Part B Interview/Screening Tool and the Locator. Learners do not have to use the Locator on subsequent Writing tests.
Use the Statewide SABES developed CLAS-E Locator Answer Booklet and SABES-developed Writing Assessment Answer Sheets Only
Programs must not use any program- or teacher-developed CLAS-E answer sheets; depending on the answer sheet, learners may gain an unfair advantage or experience a disadvantage. The test publisher has granted permission for Massachusetts to use SABES-developed answer sheets for both the Locator and the Writing assessment, and programs must use only these two answer sheets in order to maintain consistency and a level playing field for all test-takers.
Following Test Administration Procedures and the Time Limits of the Test
Test administrators must administer and score the TABE CLAS-E Writing Assessment test exactly according to the CLAS-E trainings offered by SABES. Test administrators must not deviate from the script or test directions as they are presented in the training in any way.
Strictly adhere to the time limits given in test administration materials: the Part B Interview/Screening Tool is 5-10 minutes; Locator is 15 minutes; the Multiple Choice portion of the Writing test is 20 minutes, and the Expository Writing portion of the test is 27 minutes.
Using Assessment Reports for Tracking When to Post-Test
The SMARTT system generates assessment reports so that a program can track when to administer the next CLAS-E test.1 The CLAS-E report will list the date administered for each test: either the May/June test copied from the previous fiscal year, or a new pre-test and any optional mid-year test for the current fiscal year. The assessment reports may also be used to check which learners have not yet been post-tested near the end of the fiscal year. Any of the reports may be exported into Excel.
To access these reports, go into SMARTT, select the “Site” link on the left menu in SMARTT and select “Assessment Report Primary Set.”
To see CLAS-E Reading assessments taken by all learners at the program regardless of their Primary Assessment Area, instead select “CLAS-E Reading Report” (shown in green) below the “Assessment Report Primary Set.”
Any tests started before the end of the fiscal year but completed after June 30th will be counted as pre-tests in the next fiscal year.
Test Conditions
The testing location must be quiet and comfortable so learners will not be distracted by their surroundings while taking the test. Learners may be tested in a quiet computer lab, empty classroom, office, or other space. Learners may not be tested in an occupied classroom or space where other students are in class or talking.
After Administering the CLAS-E Writing Assessment
Scoring the CLAS-E Expository Writing Tests
Certified CLAS-E Writing Test Administrators may score the Multiple Choice section of their own students’ tests. Programs may hire other non-staff certified CLAS-E scorers to help with pre- and post-testing at the program. To obtain a list of certified scorers, contact Joan Ford. Programs will contact and negotiate a rate of pay directly with the scorer.
Scorers must use the CLAS-E Expository Writing Rubric, Notes to the Scorer, Supplementary Scoring Guide (released September 2011), and benchmark writing samples each time they score. To ensure consistent CLAS-E Expository Writing Folio test scores statewide, all test administrators must achieve inter-rater reliability so that scoring is uniform across the state. All scorers need practice and refresher work before any testing session to maintain this uniformity. The goal is for all scorers to calibrate themselves to the rubric and training materials, not to each other.
The rubric, Notes to the Scorer, Supplementary Scoring Guide, and benchmark writing samples in the TABE CLAS-E Writing Scoring Guide and the SABES training materials are the standards by which to score. When in doubt (e.g., if the benchmark writing samples do not appear to agree with the CLAS-E rubric), follow the rubric.
Classroom teachers may not score their own students’ tests.
Two scorers must score each examinee’s Expository Writing portion of the test. Scorers must not discuss their scores until they have finished scoring.
If the scores of two readers differ by one point on any of the five expository test items, the two scores are averaged for that item (not rounded). Once all five items have been scored, sum the item scores and round up the total for the final score.
For example, scorers “A” and “B” scored a student’s expository writing portion of the CLAS-E Writing test. On three items their scores differed by one point, so each of those item scores is the average of the two readings (e.g., 3 and 2 average to 5 / 2 = 2.5; 2 and 1 average to 3 / 2 = 1.5; 3 and 2 average to 5 / 2 = 2.5). The summed total of 11.5 is rounded up at the end to a final score of 12.
If the scores of the two readers of the expository writing folio portion of the CLAS-E differ by more than one point, a third reader is needed to determine the Final Item Score. Follow the directions in the Writing Scoring Guide for using a third reader.
Programs may request that Joan Ford be a third reader, if she is available to do so. (Small programs may wish to collaborate among local programs, and mail or fax essays to one another, or hire a certified CLAS-E scorer when a third reader is needed.)
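The two-reader averaging, rounding, and third-reader rules above can be sketched in code. The function `score_expository` is a hypothetical illustration of those rules, not an official scoring tool:

```python
import math

def score_expository(scorer_a, scorer_b):
    """Combine two readers' scores for the five expository items.

    Returns (final_total, items_needing_third_reader). Items where the
    readers differ by more than one point are excluded from the total
    until a third reader determines their final item scores.
    """
    item_scores = []
    needs_third_reader = []
    for item, (a, b) in enumerate(zip(scorer_a, scorer_b), start=1):
        diff = abs(a - b)
        if diff == 0:
            item_scores.append(a)            # readers agree: use the score
        elif diff == 1:
            item_scores.append((a + b) / 2)  # average, do not round the item
        else:
            needs_third_reader.append(item)  # differ by more than one point
    # Round up only once, at the end, on the summed total.
    return math.ceil(sum(item_scores)), needs_third_reader
```

For instance, item scores of 2.5, 1.5, 2.5, 3, and 2 sum to 11.5, which rounds up to a final score of 12.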
For ease of use, the CLAS-E Notes to the Scorer from the Writing Scoring Guide have been reformatted by Levels 1-4, with the notes taken verbatim from the Writing Scoring Guide. Programs must use this version.
Programs are Strongly Encouraged to use the Guidelines for Retesting with TABE CLAS-E Writing Assessment
The CLAS-E Writing Retesting Guidelines are intended to help programs determine either 1) if retesting using a different Writing level is needed, or 2) if students’ test scores are a reasonably accurate reflection of their abilities.
Important Note: Checking for Retesting is only needed during the first time a student is tested using the CLAS-E Writing (e.g., at the pre-test).
Programs have encountered difficulty in using TABE CLAS-E Writing scores to evaluate gain between pre- and post-tests. To determine whether a student’s score on the TABE CLAS-E pre-test is a reasonable reflection of that student’s proficiency, compare the student’s scale score to the guidelines below for the difficulty level of the test taken (1, 2, 3, or 4). For each possible scale score, the guidelines direct instructors to one of three actions: retest immediately using the next lower level of the TABE CLAS-E, do not retest, or retest immediately using the next higher level of the CLAS-E.
The guidelines presented below were developed with careful statistical consideration of the standard error of measurement (SEM).2 The SEM is a statistical estimate of the amount of error to be expected in a particular score from a particular test, and it provides the user with a range within which a student’s true score is likely to fall. A lower SEM is associated with more precise measurement, while a higher SEM means that an individual’s score contains more error and is less reliable. SEM is a reasonable indicator of the reliability of test results. An individual student’s observed score from a single testing experience is likely to fall within one SEM of the student’s true score 68% of the time, and within two SEMs 95% of the time. These guidelines are computed from the statistical properties of the tests so that decisions about whether to retest students are based on the statistical levels of error in the scores.
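As a numeric illustration of how SEM produces those score ranges, the sketch below applies the standard formula SEM = SD × √(1 − r). The SD and reliability values here are made up for the example; they are not the published TABE CLAS-E statistics.

```python
import math

def sem(sd, reliability):
    """Standard error of measurement: SEM = SD * sqrt(1 - r)."""
    return sd * math.sqrt(1 - reliability)

# Illustrative numbers only (not the published CLAS-E values):
# SD = 40 scale-score points, KR-20 reliability r = .91, so SEM is about 12.
s = sem(40.0, 0.91)
observed = 500
band_68 = (observed - s, observed + s)          # true score falls here ~68% of the time
band_95 = (observed - 2 * s, observed + 2 * s)  # true score falls here ~95% of the time
```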
TABE CLAS-E Writing Assessment Retesting Guidelines Chart
CLAS-E Level 1
If your student’s scale score is 470 or below, DO NOT RETEST
If your student’s scale score is 471 or above, RETEST with LEVEL 2
CLAS-E Level 2
If your student’s scale score is 371 or below, RETEST with LEVEL 1
If your student’s scale score is between 372 and 513, DO NOT RETEST
If your student’s scale score is 514 or above, RETEST with LEVEL 3
TABE CLAS-E Writing Assessment Retesting Guidelines Chart, cont.
CLAS-E Level 3
If your student’s scale score is 420 or below, RETEST with LEVEL 2
If your student’s scale score is between 421 and 545, DO NOT RETEST
If your student’s scale score is 546 or above, RETEST with LEVEL 4
CLAS-E Level 4
If your student’s scale score is 495 or below, RETEST with LEVEL 3
If your student’s scale score is 496 or above, DO NOT RETEST
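The chart above can be encoded as a simple lookup. The names `RETEST_BOUNDS` and `retest_action` are hypothetical, for illustration only; the chart itself remains the authoritative guideline.

```python
# "Do not retest" score ranges from the chart above, keyed by CLAS-E level.
# None means the chart gives no retest direction on that side of the range.
RETEST_BOUNDS = {
    1: (None, 470),  # 471 or above -> retest with Level 2
    2: (372, 513),   # 371 or below -> Level 1; 514 or above -> Level 3
    3: (421, 545),   # 420 or below -> Level 2; 546 or above -> Level 4
    4: (496, None),  # 495 or below -> retest with Level 3
}

def retest_action(level, scale_score):
    """Return the retesting direction for a first-time CLAS-E Writing score."""
    lower, upper = RETEST_BOUNDS[level]
    if lower is not None and scale_score < lower:
        return f"retest with Level {level - 1}"
    if upper is not None and scale_score > upper:
        return f"retest with Level {level + 1}"
    return "do not retest"
```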
Recalibrate each Time you Score
CLAS-E testers must recalibrate each time they score the Expository Writing to ensure consistent scoring accuracy among all CLAS-E test scorers. Before scoring tests, scorers must recalibrate themselves to the Writing Scoring Guide’s Rubrics, Notes to the Scorer, Supplementary Scoring Guide (released September 2011), and benchmark writing samples. Recalibration means re-familiarizing and aligning yourself with what the rubric defines for each score. When in doubt (e.g., if the benchmark writing samples do not appear to agree with the CLAS-E rubric), follow the rubric. Contact April Zenisky at firstname.lastname@example.org or Joan Ford to access the Supplementary Scoring Guide.
CLAS-E Scoring Monitor
Programs must designate one person as the CLAS-E Scoring Monitor. The monitor compiles all the scores and maintains them in one place. She/he ensures that scoring procedures are followed, and notes if any of the readers’ scores differ from each other by more than one point. If this occurs, the CLAS-E Scoring Monitor follows up to determine the final score and makes sure these two readers go through the calibration process again.
After Scoring Tests, Track Scoring Consistency
Program staff must track scoring consistency on a regular basis. If consistency slips, the staff trained in the CLAS-E and the Scoring Monitor need to discuss how to immediately rectify the inconsistency. The CLAS-E Scoring Monitor should track how many third readings are needed and the overall performance of readers. Programs are encouraged to contact Joan Ford to discuss and/or to provide additional training if needed. If a reader is frequently off by more than one point, then the program’s CLAS-E Scoring Monitor should immediately contact Joan for technical support.
Score Reporting
CLAS-E scale scores must be recorded in SMARTT. Scale scores are the type of score used for the CLAS-E, and they are used to compute and derive all other scores associated with the CLAS-E Writing assessment. The SMARTT ABE database will translate the scale scores into the levels stipulated by the federal National Reporting System (NRS). Programs may generate reports that portray student NRS educational functioning levels and gains using SMARTT.
Exit Criteria for NRS Advanced ESL Level Students
A scale score of 612 and above in CLAS-E Writing Level 4 (both Forms A and B) is the exit criterion for students in the Advanced ESL level. Once students attain a score of 612, they need to exit the program at the end of the fiscal year if their primary assessment area is ESOL Writing.
TABE CLAS-E Writing Scale Scores for NRS Educational Functioning Levels Chart
* CLAS-E Writing Scale Scores are the combination of the multiple choice assessment and the expository writing folio scores. Following is another chart indicating how the CLAS-E test levels 1-4 correspond to the test’s scale scores, the SPL and NRS levels:
TABE CLAS-E Writing Scores’ Correlation to NRS (National Reporting System) EFL (Educational Functioning Levels) and SPL (Student Performance Levels) Chart
CLAS-E Writing Scale Scores* and Corresponding NRS EFL Levels
200 – 396 (SPL 0 – 1): Beginning ESL Literacy
397 – 445: Low Beginning ESL
446 – 488: High Beginning ESL
489 – 520: Low Intermediate ESL
521 – 555: High Intermediate ESL
556 – 612: Advanced ESL
* CLAS-E Writing Scale Scores are the combination of the multiple choice assessment and the expository writing folio scores.
Measuring Learner Gains
Learning gains are calculated each fiscal year from pre- and post-testing and are based on learners’ first test (pre-test) and last test.
Massachusetts measures learners’ educational gain in two different ways. First, Massachusetts reports the number of ESOL learners completing or advancing one or more Educational Functioning Levels (EFL) as defined by the US Department of Education’s National Reporting System (NRS). Massachusetts is required by the US Department of Education to not only report learning gains based on EFL completion rates, but also to use EFL completion rates as a measure of program performance.
In addition to measuring learning gains by EFL completion rates, Massachusetts measures “meaningful educational gain.” Meaningful educational gain is measured solely by the improvement in test scores between the pre- and post-test and does not take Educational Functioning Levels into consideration. A gain of 25 or more scale score points indicates meaningful educational gain.
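Both measures can be sketched in code. The EFL bands below are read from the scale-score chart earlier in this document; the Advanced ESL upper band is an assumption based on the 612 exit criterion, and the function names are hypothetical, not part of SMARTT.

```python
# NRS EFL bands by CLAS-E Writing scale score (lower bound of each band),
# as read from the chart earlier in this document.
EFL_BANDS = [
    (200, "Beginning ESL Literacy"),
    (397, "Low Beginning ESL"),
    (446, "High Beginning ESL"),
    (489, "Low Intermediate ESL"),
    (521, "High Intermediate ESL"),
    (556, "Advanced ESL"),  # assumption: 556-612, per the 612 exit criterion
]

def band_index(scale_score):
    """Index of the EFL band containing a scale score."""
    index = 0
    for i, (lower, _name) in enumerate(EFL_BANDS):
        if scale_score >= lower:
            index = i
    return index

def completed_efl(pre_score, post_score):
    """EFL completion: the post-test falls in a higher band than the pre-test."""
    return band_index(post_score) > band_index(pre_score)

def meaningful_gain(pre_score, post_score):
    """Meaningful educational gain: 25 or more scale score points."""
    return post_score - pre_score >= 25
```

Note that the two measures can disagree: a 10-point gain that crosses a band boundary counts as EFL completion but not meaningful gain, and a 30-point gain within one band counts as meaningful gain only.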
Measuring completion of Educational Functioning Levels for the federal US Department of Education (National Reporting System, Federal Report Table 4)
The National Reporting System (NRS) requires that all students who have 12 hours or more of attendance be included in all Federal Report tables, including those reporting pre- and post-tested learners and those making gains by completing an Educational Functioning Level.
Copying over CLAS-E Scores to the Next Fiscal Year
Any CLAS-E Writing tests given to students in May or June may be rolled over to count as the first (pre) test in the new fiscal year. Program staff also have the option to give a new pre-test. The copy icon will appear next to all May/June tests so that users may copy the scores from any site to their own site in the new Fiscal Year.
The May/June test will then be dated July 1 of the new fiscal year. The July 1 date is color-coded to let program staff know it was copied.
Transitions programs, see specific policies at http://www.doe.mass.edu/acls/assessment/TCCPpolicy.html.
Program staff must read the Assessment Updates in the ACLS Monthly Mailings for important new information: http://www.doe.mass.edu/acls/mailings.
For assessment-related questions, please contact April Zenisky-Laguilles at email@example.com, or Joan Ford, SABES director of assessment professional development, at Bristol Community College, 777 Elsbree Street, Building Q, Fall River, MA 02720; Phone: (774) 357-2190; Email: firstname.lastname@example.org; Fax: (508) 730-3280.
For policy-related questions, please contact Dana Varzan-Parker, Program and Assessment Specialist, at Adult and Community Learning Services (ACLS), 75 Pleasant Street, Malden, MA 02148; Phone: 781-338-3811; Email: email@example.com; Fax: (781) 338-3394.
1 Note that the date for the next assessment to be taken is based on four months, but adult learners in intensive programs who reach 65 hours of instruction may be tested before two months elapse. Programs may not test learners more than three times per fiscal year.
2 SEM is computed as SEM = SD × √(1 − r), where SD is the standard deviation of the test and r is the reliability. For TABE CLAS-E, reliability coefficients provided by the publisher are KR-20 estimates of internal consistency. SDs and KR-20 estimates are from the TABE Technical Report. The SEM for CLAS-E is computed in Sireci, 2011.