Request for Proposal (RFP)




3.2.1.20 Disaster Recovery Plan


The vendor shall provide a description of its plan to routinely back up all systems, applications, and databases to both an onsite and an offsite location. Additionally, the vendor shall detail its plan for data recovery in the event a disaster is declared at the location where the data is maintained and stored. Database transaction logs must be archived and maintained online for 48 hours.
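For illustration only, the 48-hour transaction log requirement amounts to a retention window on archived logs. The Python sketch below is a hypothetical example; the archive directory, file naming, and scheduling are vendor decisions, not specified by this RFP:

```python
import os
import time

# Hypothetical archive location; the actual path is vendor-defined.
ARCHIVE_DIR = "/var/backups/txn_logs"
RETENTION_SECONDS = 48 * 60 * 60  # logs stay online for 48 hours

def prune_expired_logs(archive_dir=ARCHIVE_DIR, retention=RETENTION_SECONDS, now=None):
    """Delete archived log files older than the retention window; return their names."""
    now = time.time() if now is None else now
    removed = []
    for name in os.listdir(archive_dir):
        path = os.path.join(archive_dir, name)
        # A file "expires" when its last-modified time falls outside the window.
        if os.path.isfile(path) and now - os.path.getmtime(path) > retention:
            os.remove(path)
            removed.append(name)
    return removed
```

A job like this would run on a schedule (e.g. hourly) after each log archive is copied offsite.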

3.2.1.21 Data Management


The Agency requires that the vendor’s data management system interface with the Agency’s data management system. A general information file of individual student performance must be submitted to the Executive Director, Office of Technology, immediately upon completion of scoring. Please note that the vendor has three weeks to score multiple-choice test items.
The Agency requires vendors to provide software solutions to the data management and disaggregation of data at the class, grade, school, county and state levels. In addition, disaggregated group reports must be available by:

  • limited English proficient

  • race/ethnicity (as specified in NCLB legislation)

  • gender

  • economically disadvantaged

  • migrant

  • students with disabilities

  • other groups as specified over the life of the contract
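For illustration only, the required class/grade/school/county/state disaggregation by subgroup is a standard group-by computation. The Python sketch below uses pandas with hypothetical column names; the actual variable names and data layout are subject to Agency approval:

```python
import pandas as pd

# Hypothetical student-level results; column names are illustrative only.
scores = pd.DataFrame({
    "school": ["A", "A", "B", "B", "B"],
    "county": ["X", "X", "X", "Y", "Y"],
    "gender": ["F", "M", "F", "M", "F"],
    "lep":    [True, False, False, True, False],  # limited English proficient
    "score":  [410, 455, 390, 430, 470],
})

def disaggregate(df, level, subgroup):
    """Mean score and student count at one reporting level, broken out by subgroup."""
    return (df.groupby([level, subgroup])["score"]
              .agg(mean_score="mean", n_students="count")
              .reset_index())

# e.g. school-level report disaggregated by gender
by_school_gender = disaggregate(scores, "school", "gender")
```

The same function covers any reporting level (class, grade, school, county, state) crossed with any subgroup column.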

School, county, and state level reports will be provided by the vendor through a secure FTP site. The vendor will work in conjunction with the Agency to finalize the data and data layout. The Agency will have final approval of variable names and the formatting layout. All optional items for purchase will follow the established format.


Any proprietary software required (along with all software support) to read the data must be included for the Agency and updated throughout the contract. Vendors are to describe this software in full. If additional copies will be required at the county level, pricing for this must be included for the life of the contract.
3.2.1.22 Disposal/Final Delivery/Destruction of Materials

The vendor agrees to deliver to the Agency, or to destroy, upon request, all materials and products in all forms developed for and used in conjunction with this project within 30 days following acceptance by the Agency of the final report for the project, including:



  • test items and performance tasks

  • graphics

  • scoring materials

  • test books

  • answer documents

  • final electronic files of ancillary materials

  • computer discs, CDs, DVDs, or other media

  • computer listings

  • computer files

  • paper files

Payment of the final project invoice will not be made until all materials and certification of destruction, as appropriate, are received and approved by the Agency and final payment resolution is agreed to by both parties. Written verification of the delivery, or destruction, will be provided to the Agency as part of the final contract report.






Grades 3-11

(Contingent upon funding, the Agency reserves the right to activate this section as tests by grade levels at any time during the life of the contract)

  • WESTEST Grades 3-11

  • Alternate Performance Task Assessment

  • Modified Assessment

  • Writing Assessment

  • Algebra 1 End of Course

  • Online Classroom Assessment


3.2.2.1 GRADES 3-11 Introduction

The WESTEST 3-11 is designed to measure student achievement of the 21st Century WV CSOs and will also be used to measure school and county effectiveness as outlined in Policy 2320: Process for Improving Education: Performance Based Accreditation System. As required in the 2001 NCLB Act, only WESTEST Grades 3-8 and 11 will be used in AYP calculations. WESTEST Grades 9 and 10 will not be used for NCLB accountability calculations. (See Section 2.2 Table 1)


WESTEST grades 3-11 will be a customized NRT/CRT test of the 21st Century West Virginia Content Standards and Objectives for reading/language arts, mathematics, science, and social studies. This section of the proposal will include the development of online forms for grades 3-11, including a comparability study. Counties will be able to choose online or paper/pencil administration of the assessment beginning in 2008-2009. Vendors must provide costs for two options.

Option 1: Multiple choice test items only

Option 2: Combination of multiple choice and constructed response items
WV requires the vendor to provide a detailed plan of how it proposes to meet all development requirements. Vendors should elaborate on these requirements where appropriate and offer new ideas regarding the development process when possible.
All elements of the development process including procedures, processes, and products used by the vendor to complete contract work are subject to final approval by the Agency. The Agency is to participate fully in all form construction. Any items developed specifically for the Agency will be owned by the Agency. The vendor will work closely with the Agency and groups of West Virginia educators to complete all work tasks. Throughout the contract period, the vendor will confer with the Agency on a continuing and consistent basis and will be involved in frequent face-to-face meetings with the Agency as needed.

3.2.2.1.1 Test Construction Specifics


The specifics for test construction for the tests in grades 3-11 in Reading/Language Arts, Mathematics, Science, and Social Studies must include:

  • Evidence that test forms align to the test blueprints (content, standards, objectives, thinking skill levels chart)

  • Test form format that follows Principles of Universal Design for assessments

  • Schedule of second copy reviews

  • Schedule of camera copy reviews

  • Schedule of blueline reviews

  • Description of process used to flag items based on Differential Item Function (DIF) analysis and what happens to these items

  • Description of the process used to ensure that only items within a .2 – .8 p-value range are used

  • Two electronic forms, which may be phased in by schools that are ready for electronic test administration, and a related comparability study
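To make the p-value screen in the list above concrete: an item's p-value is the proportion of examinees answering it correctly, and items outside the .2 – .8 range are excluded from use. A minimal Python sketch (data shapes hypothetical):

```python
def p_value(responses):
    """Proportion correct, where responses is a list of 0/1 item scores."""
    return sum(responses) / len(responses)

def screen_items(item_responses, low=0.2, high=0.8):
    """Partition items into usable and flagged sets by the p-value range.

    item_responses maps an item id to its list of 0/1 scored responses.
    Items that nearly everyone answers correctly (or incorrectly) carry
    little information about achievement differences, so they are flagged.
    """
    usable, flagged = {}, {}
    for item_id, responses in item_responses.items():
        p = p_value(responses)
        (usable if low <= p <= high else flagged)[item_id] = p
    return usable, flagged
```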

Please note that the 21st Century West Virginia CSOs will be effective July 2008 and may be found at:



http://wvde.state.wv.us/policies/p2520.1_ne.pdf

http://wvde.state.wv.us/policies/p2520.2_ne.pdf

http://wvde.state.wv.us/policies/p2520.3_ne.pdf

http://wvde.state.wv.us/policies/p2520.4_ne.pdf

http://wvde.state.wv.us/policies/p2520.35_co.pdf
3.2.2.1.2 Blueprint/Item Specifications

The vendor will work with the Agency to plan a system for accomplishing the task of blueprint and item specification development. The vendor will develop, in conjunction with the Agency, a blueprint (CSOs and DOK thinking skill levels) for the Grades 3-11 WESTEST. The vendor must describe its process for developing and updating the test blueprints and item specifications over the life of the program. The Agency has final approval of all test blueprints, item specifications, and higher-order thinking skill designations. It will be the vendor’s responsibility to maintain, and share with the Agency, an updated electronic version of all item specifications.


The test design document will describe the relative emphasis of the standards, objectives, and thinking skill levels to be assessed. The document will include the approximate number of items to be included on each form of the assessment. The test design will also describe the measurement model and procedures implemented for WESTEST Grades 3-11. This blueprint document will form the basis for developing the first draft of the item specifications and the scoring specification documents. The vendor will provide in the proposal their process for creating test blueprints.
3.2.2.1.3 Item Specifications

After the award of the contract, the vendor will conduct a review of WESTEST Test Item Specifications, the current WV Content Standards and Objectives, and the required distribution of thinking skills to be provided in the test item selection per grade level, per content area. The vendor will follow the Agency specifications for all test item development for grades 3-11. The Agency expects the vendor to align items to the final DOK for each standard and objective per grade level and per content area. All DOK determinations will use the Webb alignment model and definitions of DOK. The successful vendors will be provided with the DOK distribution charts for the 21st Century West Virginia Content Standards and Objectives by grade level per content area. The newly developed assessments will require a large number of items at higher thinking skill levels; therefore, all proposals must demonstrate how higher levels of thinking will be addressed. Based on the review, the vendor will provide and distribute an updated version of the specifications for all grades 3-11.


It will be the vendor’s responsibility to maintain an updated electronic version of the item specifications. These specifications will be used throughout the development cycle by item writers and reviewers and at item review meetings with West Virginia educators and Agency staff. The review will be attended by Agency staff and may involve West Virginia educators. The vendor must provide a detailed description, with examples, of how item specification work was completed in other states or under other contracts.

Table 8: Grades 3-11 FIELD TEST DESIGN – 2007-2008
Option 1 with Customized Constructed Response Items

Counts are the number of required items per content area.

            Reading/Language Arts   Mathematics    Science       Social Studies
Grade          MC        CR          MC     CR      MC     CR      MC     CR
Grade 3       390        30         282     30     270     30     270     30
Grade 4       420        30         282     30     270     30     270     30
Grade 5       420        30         282     30     270     30     270     30
Grade 6       420        30         282     30     270     30     270     30
Grade 7       420        30         282     30     270     30     270     30
Grade 8       420        30         282     30     270     30     270     30
Grade 9       420        30         282     30     270     30     270     30
Grade 10      450        30         282     30     270     30     270     30
Grade 11      450        30         282     30     270     30     270     30

MC = Multiple Choice; CR = Constructed Response

Table 9: Grades 3-11 OPERATIONAL DESIGN 2009-2014

Option 1 with Customized Constructed Response Items

Counts are the number of required items per content area.

            Reading/Language Arts   Mathematics    Science       Social Studies
Grade          MC        CR          MC     CR      MC     CR      MC     CR
Grade 3        65         5          47      5      45      5      45      5
Grade 4        70         5          47      5      45      5      45      5
Grade 5        70         5          47      5      45      5      45      5
Grade 6        70         5          47      5      45      5      45      5
Grade 7        70         5          47      5      45      5      45      5
Grade 8        70         5          47      5      45      5      45      5
Grade 9        70         5          47      5      45      5      45      5
Grade 10       85         5          47      5      45      5      45      5
Grade 11       85         5          47      5      45      5      45      5

MC = Multiple Choice; CR = Constructed Response
Table 10: Grades 3-11 Field Test Design - 2007-2008

Option 2 without Customized Constructed Response Items

Counts are the number of required multiple choice items per content area.

Grade       Reading/Language Arts   Mathematics   Science   Social Studies
Grade 3             420                 312         300          300
Grade 4             450                 312         300          300
Grade 5             450                 312         300          300
Grade 6             450                 312         300          300
Grade 7             450                 312         300          300
Grade 8             450                 312         300          300
Grade 9             450                 312         300          300
Grade 10            480                 312         300          300
Grade 11            480                 312         300          300

MC = Multiple Choice
Table 11: Grades 3-11 Operational Test Design - 2009-2014

Option 2 without Customized Constructed Response Items

Counts are the number of required multiple choice items per content area.

Grade       Reading/Language Arts   Mathematics   Science   Social Studies
Grade 3              70                  52          50           50
Grade 4              75                  52          50           50
Grade 5              75                  52          50           50
Grade 6              75                  52          50           50
Grade 7              75                  52          50           50
Grade 8              75                  52          50           50
Grade 9              75                  52          50           50
Grade 10             80                  52          50           50
Grade 11             80                  52          50           50

MC = Multiple Choice
3.2.2.1.4 Item Development/Selection Process

The vendor will work with West Virginia educators to review all available vendor items per grade level, per content area. West Virginia requires that items with poor grammar, duplicative answer choices, poor artwork, alignment problems, or other issues be corrected/revised prior to the field test. All grades 3-11 test items selected for form construction must measure the knowledge and skills identified in the 21st Century WV CSOs. The vendor should provide in the proposal its process for creating test blueprints.


The vendor’s proposal will describe and specify the technical quality of its test development procedures in terms of their alignment to the validity and reliability principles of the APA Standards. The test development procedures must result in items and passages created to generate reliable and valid performance as per the specifications and at the levels of the current WESTEST. During development, the consideration of additional technical factors should include, but not be limited to, test bias and fairness of items to all students, readability of test directions and items, and item analysis. (See Section 3.2.2.1.11 for submittal of test items for committee review.)
All selected passages and items and the new items developed by the vendor for grades 3-11 must be field tested prior to actual operational administration. After the field test, the vendor is to review appropriate item statistics with select Agency staff. In the proposal, the vendors will describe in writing for the Agency, and for later committee reference, the type of item statistics to be used in determining appropriateness of student performance data for use in the final test forms. In the final Technical Report, the vendor shall provide assurances that students with disabilities, limited English proficiency, and each of the West Virginia NCLB race/ethnic subgroups were appropriately included.

The vendor shall describe in detail the procedures to be used in detecting item bias and the other technical quality principles to be applied. The determination of whether or not each field-tested item goes into the operational item pool, becoming eligible for use in operational tests, is to be a joint decision between the vendor and the Agency, with the Agency having final approval. The Agency plans to participate fully in the item selection process and form construction.
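For illustration only, one widely used procedure for detecting item bias of the kind described above is the Mantel-Haenszel differential item functioning (DIF) statistic, which compares reference-group and focal-group performance on an item within matched-ability strata. The sketch below is a generic Python illustration, not a required or vendor-specific procedure:

```python
import math

def mh_odds_ratio(strata):
    """Mantel-Haenszel common odds ratio across matched-ability strata.

    Each stratum is a tuple (ref_correct, ref_incorrect,
    focal_correct, focal_incorrect) for one total-score band.
    A ratio near 1.0 means the groups perform comparably on the item.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

def mh_delta(strata):
    """ETS delta metric; larger absolute values indicate stronger DIF."""
    return -2.35 * math.log(mh_odds_ratio(strata))
```

In practice, items with large |delta| values would be routed to the Bias Review Committee rather than rejected automatically.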


The test format, directions for administering, and item development shall use the following guidelines:

  • All test materials/answer documents must be formatted in a manner assuring ease of readability. The style and font size of type used shall be large enough to be easily readable and have physical characteristics that research has indicated facilitate reading.

  • The time limits will be reflected as guidelines in directions read orally to students by the test administrator. The directions will indicate to students the testing schedule.

  • Items must not elicit any information that would be considered in the realm of “values and/or morals,” and the student should not be led to discuss religious or political values or any issue that would invade personal or family privacy.

  • Verbal information in the item should be appropriate in vocabulary and general readability for the grade level tested.

  • The operational exam will be administered during a two-week window. Week 1 will be the testing window and Week 2 will be the make-up window. Two forms will be constructed and will be rotated annually. The form not used will be the breach form.

  • All test booklets must be available in Braille and large print formats.


3.2.2.1.4.1 Universal Design Principles

The implementation of the Universal Design Principles is a critical feature of the NCLB legislation. The vendor will provide the test documents, to be held confidential, to the Agency for review of content alignment, quality of item construction/answer selection, thinking skill levels, page layout, artwork, and Universal Design Principles for all content areas. The requested information will be made available for committee review for all review participants who sign a security agreement with the Agency. The vendor must document the procedures used to assure that Universal Design Principles have been used in development of test items and must describe in detail how principles of Universal Design are implemented in the creation of items, forms, and all other student assessment materials. The vendor should provide samples of the Universal Design protocols for committee review.


3.2.2.1.5 Reviews

The vendor shall be responsible for coordinating and conducting the Content Review Committee meetings and the Bias Review Committee meetings in West Virginia. The Face-to-Face Review will be performed at the vendor’s site, or at a location mutually agreeable to both the Agency and the vendor. The vendor must include a schedule of activities, timelines, anticipated number of days for completion, and number of participants required for conducting these meetings. The vendor must submit their protocol for conducting Content, Bias, and Face-to-Face Reviews. The vendor shall have appropriate staff on-site to successfully conduct all review meetings. The Agency retains the right to involve West Virginia teachers in all reviews. Committee members must have access to computers in order to experience the test passages and items in an online format. The Agency will provide the computers for accessing the testing materials during the reviews. All proposals should provide examples of vendor protocols and procedures for committee review.


The vendor will be responsible for all costs and arrangements related to the review meetings for vendor personnel and meeting materials. The Agency will assume the cost for the facility, refreshments, and lunch, as well as travel reimbursements (hotel, mileage, meals) for non-vendor participants.
3.2.2.1.5.1 Content Review Meetings

The vendor will outline a plan for a review of test passages and items/prompts and their scoring rubrics by the West Virginia Content Review Committees. The plan will include content reviews of all items with particular emphasis on the congruency of items with test specifications, readability requirements, Universal Design Principles, technical quality, content match, and continuity and articulation of skills across the assessed grade levels. Procedures and materials for orienting reviewers must be described. Content Review Committees will be established to ensure the test passages and items/prompts accurately reflect item specifications and test blueprints. The vendor shall provide a minimum of one qualified item writer to facilitate the content reviews for each grade/content area. The vendor shall be responsible for preparing all meeting minutes and providing monthly reports to the Agency. All proposals should provide examples of vendor protocols and procedures for committee review.


3.2.2.1.5.2 Bias Reviews

The vendor will be responsible for providing a plan for a bias/sensitivity review of test items. The Bias Review Committee will review all of the items to ensure test items are fair and free of bias for all students, including students who are visually impaired and/or hearing impaired, or deaf. The Bias Review Committee will meet prior to the field test. There will be one additional meeting of the Bias Review Committee for the purpose of reviewing reading passages, which takes place prior to initial item development. All proposals should provide examples of vendor protocols and procedures for committee review.


The vendor shall provide a minimum of one qualified item writer for each grade/content area to facilitate the Bias Reviews. The Agency may conduct a separate Bias/Sensitivity Review, with the results being provided to the vendor. Agency personnel will be in attendance during the review sessions.
3.2.2.1.5.3 Face-to-Face Item Selection Final Review

Following the field test, the vendor will provide a plan for Face-to-Face Item Review meetings with the Agency staff at the vendor’s site, or at a different location that is mutually acceptable for the Agency and the vendor. The vendor must have at least one item writer per content area/grade level with related content expertise available to successfully conduct Face-to-Face Item Review meetings. The Agency expects each vendor to provide recommendations concerning the timeline and number of participants required for this activity. At least two weeks prior to the Face-to-Face Item Review, the vendor will provide the Agency a written report including revisions of the items, as well as comments and suggestions on the content and editorial issues, based on the content and equity reviews. All proposals should provide examples of vendor protocols and procedures for committee review.


3.2.2.1.6 Form Development Process

3.2.2.1.6.1 Field Test Development

The vendor shall be responsible for developing and producing field test forms, answer documents, manuals, and ancillary materials that resemble, to the greatest extent possible, the operational assessment materials. Braille and large print versions are required for the field test.


The vendor will design the field test to provide information to evaluate test items and to evaluate the overall test design. The vendor will describe the characteristics of the field test providing calibration results, scoring procedures, and other data from the 2008 field test. The documentation must also address the reliability of scores and dimensionality of the proposed operational tests, and include multiple methods for estimating the underlying dimensionality of test data as determined in consultation with and approved by the Agency. A written report of the results of this investigation will be presented to the Agency before October 2008. The vendor will make recommendations for scaling approaches in the event of appreciable multidimensionality. The contractor will communicate frequently with the Agency during the field test period.
It is the responsibility of the vendor to group the newly developed items into item sets to be included in the field test forms. Using items that have been revised, as determined after the Item Review Meetings, the vendor will develop a plan that indicates which of the items should be field tested. Two electronic forms must be developed for schools that are prepared to administer online tests. The Agency will review the vendor’s plan and approve it or request revisions. The determination of whether or not each item goes into the item pool for use in operational tests is to be a joint decision between the vendor and the Agency, with the Agency having final approval. The Agency is committed to reviewing products submitted by the vendor as efficiently as possible. All technical data requested in this section must be included in the Field Test Technical Report. All proposals should provide examples of vendor protocols and procedures for committee review.
3.2.2.1.6.2 Operational Form Development

The requirements for constructing operational tests include creating test construction specifications for building the forms. Vendors must detail their specifications and form-building processes, including alignment to CSOs, p-values, and point-biserial information. In addition, test construction must be supported by sophisticated computer software that will generate test characteristics.
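For illustration only, the point-biserial statistic referenced above is the correlation between a dichotomous (0/1) item score and the total test score; higher values indicate an item that discriminates well between stronger and weaker examinees. A minimal Python sketch (population-standard-deviation form; assumes each item has at least one correct and one incorrect response):

```python
import math

def point_biserial(item_scores, total_scores):
    """Point-biserial correlation between 0/1 item scores and total test scores."""
    n = len(item_scores)
    mean_total = sum(total_scores) / n
    sd_total = math.sqrt(sum((t - mean_total) ** 2 for t in total_scores) / n)
    # Mean total score among examinees who answered the item correctly.
    correct = [t for s, t in zip(item_scores, total_scores) if s == 1]
    p = len(correct) / n  # the item's p-value
    mean_correct = sum(correct) / len(correct)
    return (mean_correct - mean_total) / sd_total * math.sqrt(p / (1 - p))
```

Form-construction software would compute this statistic (with p-values) for every candidate item when assembling forms to target specifications.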


The vendor will design and produce any camera-ready art for the test forms. Two Operational Test forms measuring the 21st Century WV CSOs will be selected (along with ancillaries) and moved through the vendor’s form development process to camera copy for each operational administration.
The Agency elects to participate in the WESTEST operational forms development. The Agency may choose to modify the design of the test books and ancillaries prior to any test administration, within the constraints of the specifications for test books and answer documents provided, and reserves the right to change this configuration, if necessary, through change orders or appropriate contract amendments. This contract requires the vendor to provide two online forms for those schools that are ready to administer WESTEST online.
3.2.2.1.6.3 Ancillary Product Development

The vendor will develop for Agency approval and print all publications, materials, and forms in compliance with Agency printing specifications. All print color proposed by the vendor must be easily read by populations with visual impairments. All print and ink color proposed by the vendor must be approved by the Agency and samples must be provided for committee review. The specifications and quantities for the major products to be printed are found in Tables 12 and 13 below.


Table 12: WESTEST Grades 3-11 West Virginia Materials Production Specifications for the 2008 Field Test of NRT and Augmented Items Assessment

Item #  Admin.          Grade  Item Description                  Color Specs  Packing  Pages/Form        Number of Forms
1       Spring 2008 FT  3      Field Test Book                   2c/2c        10s
2       Spring 2008 FT  3      Scannable Answer Document         2c/2c        10s
3       Spring 2008 FT  4      Field Test Book                   2c/2c        10s
4       Spring 2008 FT  4      Scannable Answer Document         2c/2c        10s
5       Spring 2008 FT  5      Field Test Book                   2c/2c        10s
6       Spring 2008 FT  5      Scannable Answer Document         2c/2c        10s
7       Spring 2008 FT  6      Field Test Book                   2c/2c        10s
8       Spring 2008 FT  6      Scannable Answer Document         2c/2c        10s
9       Spring 2008 FT  7      Field Test Book                   2c/2c        10s
10      Spring 2008 FT  7      Scannable Answer Document         2c/2c        10s
11      Spring 2008 FT  8      Field Test Book                   2c/2c        10s
12      Spring 2008 FT  8      Scannable Answer Document         2c/2c        10s
13      Spring 2008 FT  9      Field Test Book                   2c/2c        10s
14      Spring 2008 FT  9      Scannable Answer Document         2c/2c        10s
15      Spring 2008 FT  10     Field Test Book                   2c/2c        10s
16      Spring 2008 FT  10     Scannable Answer Document         2c/2c        10s
17      Spring 2008 FT  11     Field Test Book                   2c/2c        10s
18      Spring 2008 FT  11     Scannable Answer Document         2c/2c        10s
19      Spring 2008 FT         Test Examiner’s Manual            1c/1c                 125               1
20      Spring 2008 FT  3-11   Math Manipulatives                4 color
21      Spring 2008 FT  3-11   County Test Coordinator’s Manual  1c/1c                 50                1
22      Spring 2008 FT  3-11   District Header                   3c                    2                 1
23      Spring 2008 FT  3-11   School Header                     3c                    2                 1
24      Spring 2008 FT  3-11   Answer Book Envelopes                                                     1
25      Spring 2008 FT  3-11   Shipping Labels                   Color coded                             1
26      Spring 2008 FT  3-11   Materials Checklist               1c                    2                 1
27      Spring 2008 FT  3-11   Security Checklist                1c                    Triplicate Forms

Table 13: WESTEST Grades 3-11 West Virginia Materials Production Specifications for the 2009 Operational Administration of Supplemental Materials

Item #  Admin.               Grade  Item Description                  Color Specs  Packing  Pages/Form        Number of Forms
1       Spring 2009-2014 OP  3      Operational Test Book             2c/2c        10s
2       Spring 2009-2014 OP  3      Scannable Answer Document         2c/2c        10s
3       Spring 2009-2014 OP  4      Operational Test Book             2c/2c        10s
4       Spring 2009-2014 OP  4      Scannable Answer Document         2c/2c        10s
5       Spring 2009-2014 OP  5      Operational Test Book             2c/2c        10s
6       Spring 2009-2014 OP  5      Scannable Answer Document         2c/2c        10s
7       Spring 2009-2014 OP  6      Field Test Book                   2c/2c        10s
8       Spring 2009-2014 OP  6      Scannable Answer Document         2c/2c        10s
9       Spring 2009-2014 OP  7      Operational Test Book             2c/2c        10s
10      Spring 2009-2014 OP  7      Scannable Answer Document         2c/2c        10s
11      Spring 2009          8      Operational Test Book             2c/2c        10s
12      Spring 2009-2014 OP  8      Scannable Answer Document         2c/2c        10s
13      Spring 2009-2014 OP  9      Operational Test Book             2c/2c        10s
14      Spring 2009-2014 OP  9      Scannable Answer Document         2c/2c        10s
15      Spring 2009-2014 OP  10     Operational Test Book             2c/2c        10s
16      Spring 2009-2014 OP  10     Scannable Answer Document         2c/2c        10s
17      Spring 2009-2014 OP  11     Operational Test Book             2c/2c        10s
18      Spring 2009-2014 OP  11     Scannable Answer Document         2c/2c        10s
19      Spring 2009-2014 OP         Test Examiner’s Manual            1c/1c                 125               1
20      Spring 2009-2014 OP  3-11   Math Manipulatives                4 color
21      Spring 2009-2014 OP  3-11   County Test Coordinator’s Manual  1c/1c                 50                1
22      Spring 2009-2014 OP  3-11   District Header                   3c                    2                 1
23      Spring 2009-2014 OP  3-11   School Header                     3c                    2                 1
24      Spring 2009-2014 OP  3-11   Answer Book Envelopes                                                     1
25      Spring 2009-2014 OP  3-11   Shipping Labels                   Color Coded                             1
26      Spring 2009-2014 OP  3-11   Materials Checklist               1c                    2                 1
27      Spring 2009-2014 OP  3-11   Security Checklist                1c                    Triplicate Forms


The vendor shall also print any additional materials proposed in the implementation of this project, such as transmittal memoranda, packing labels, and packing lists. The vendor shall be responsible for all aspects of production for publishing printed products, including formatting, graphics, and key entry. For each publication in Tables 12 and 13, the vendor shall submit for approval printing plans that identify type size and style, ink and paper color, paper quality, and layout. The Agency desires attractive, high-resolution printed materials at reasonable cost.


3.2.2.1.6.4 Art and Production

Interesting, attractive design is required for all test products developed by the vendor. These designs include the organization, format, page layout, and covers required for test books, reports of assessment results, information publications, and other printed materials. The vendor will produce all graphics, charts, and illustrations for the products for which it is responsible. All graphics, charts, and illustrations will be age appropriate and approved by the Agency. The vendor must provide examples of test products created for other programs as part of the proposal. (See Section 3.2.2.1.12)


3.2.2.1.7 Accommodations

Braille and large print test booklets and answer documents will be offered as accommodations for students with disabilities. All products created for use by students will have Braille and large print versions at each grade level for visually impaired students. Established publishers of Braille and large print materials approved by the Agency will produce the large print and Braille versions of the test books, answer documents, and other documents at the vendor’s expense. The large print materials will meet or exceed the font requirements suggested by the APHB. (Refer to http://www.aph.org/) These documents will be produced and delivered to counties in the same shipment as the regular format versions of the products.


3.2.2.1.7.1 Braille

Regular test booklets shall be produced in Braille at each grade level. Approximately 10 booklets per grade level are required. The vendor is responsible for production of camera-ready formats and files to be provided to the Braille subcontractor. Any manipulatives will also be brailled.


The vendor should designate a person in the proposal (see Section 3.2.4.6.5) who is knowledgeable with regard to brailling, who will give the Agency assurance that all tests produced are modified correctly, and who will assume final responsibility for the accuracy of the Braille test instruments. The Braille test booklets/answer documents will be made available for review and approval by the Agency prior to reproduction. The Agency may employ the services of Braille proofreaders. In addition, the vendor is responsible for having the Braille materials proofed by an independent party.
Test administrator notes and scripts to accompany Braille test versions will also be developed. Supplemental directions for transferring of responses must be provided as needed to the test administrators. The APHB guidelines for brailling must be honored. (Refer to http://www.aph.org)
3.2.2.1.7.2 Large Print

Large print test booklets and answer documents will need to be created. Approximately 35 test booklets and other pertinent materials per grade will be required. The format will be horizontal, or full-view format, with at least 18-point font. The closed book size will not be larger than 14" by 17" on approved paper. Reformatting of documents may be necessary to meet these specifications.


The binding must allow the test booklet to open flat. The test booklets and answer documents will be made available for review and approval by the Agency prior to reproduction. The APHB guidelines for large print must be honored. Refer to http://www.aph.org. The vendor should provide examples of large print materials.
3.2.2.1.8 Online Pilot Testing Option

The vendor shall conduct a small-scale pilot of online testing and scoring for administration of the WESTEST program. The online administration will be available in 2009 as an option to the paper-and-pencil version for counties wishing to administer the test electronically.


The online pilot shall be conducted for randomly selected grades in Mathematics and the 7th grade Writing Assessment component. As part of the pilot, online assessments must be completed by a minimum of 5,000 students per content area. The vendor, in conjunction with the Agency, will identify a representative sample of schools to participate in the pilot and shall provide all necessary materials and technology (software, hardware, connectivity) to complete the pilot assessment. The contractor shall provide scoring of assessments completed in the online pilot.
If indicated, the vendor shall complete a study of the comparability of the online assessments and the paper/pencil assessments and shall provide a written report summarizing the findings of the study to the Agency. The vendor shall provide evidence of the validity of online testing and scoring in the context of a large-scale assessment.
The vendor shall submit a detailed written plan to the Agency for an expansion of the online assessment to include additional content areas and/or additional grade levels as specified by the Agency. The vendor’s plan for the expanded online assessment shall identify specific activities that will be performed by the vendor and a guaranteed not-to-exceed total price for the online assessment. Such guaranteed not-to-exceed total price shall be based upon the firm, fixed price per student for the Online Assessment Pilot as stated in the pricing proposal.
3.2.2.1.9 Content/Form Management System

3.2.2.1.9.1 Introduction

The vendor shall have the ability to acquire and manage items, statistical data, item codes, test booklet formats, and other pertinent information from their own (and subcontractors’) item development, scoring, and psychometric systems. The Agency will have continuous access to this system on a twenty-four hour, seven days per week basis.


The data (the items and related information) will be placed into a secure content management system that the Agency can access using either Web-based technology, or other methods proposed by the successful vendor. The proposal must contain a detailed plan to continually acquire and inventory all West Virginia content and item/form data across the life of the contract. At an absolute minimum, it will be the responsibility of the vendor to update and maintain the actual data in the management system after each field test, or operational administration. Item information will be updated each time an item is used either as a field test, or operational item. If applicable, transferring existing information between vendors’ management systems will be the responsibility of the primary vendor as designated by the Agency. The plan must also provide details of how the items and the related materials will be converted to the vendor’s content management system, if applicable. Past experience suggests that transfer of items, their accompanying graphics, and their statistical information from one system to another is problematic and time consuming. Historical information will be retained for all items, including any developed prior to this project. Vendors must build this consideration into their proposals.
The vendor shall provide assurances/solutions in the proposal to ensure this level of availability. The resolution of images of items, item graphics, and reading passages in the system must be considered. Vendors must provide documentation of their quality control procedures demonstrating the accuracy of the system and the data for the committees’ review. Quality control checks of the data in the content management system are mandatory and vendors must specify the procedures to be utilized for this purpose.
The content management system must meet the requirements of this section and shall maintain the ability to use historical information about all items. Items and passages developed and utilized during the course of the project described herein will be added to the system according to the schedule approved by the Agency, and these items will become the property of West Virginia.


3.2.2.1.9.2 Descriptive Information for Content/Form Management System

Descriptive information associated with each item to be included in the Content/Form Management System should include:


  • content area

  • grade

  • reporting category

  • content standard and grade level objective alignment

  • thinking skill levels

  • item identification and classification codes

  • item type

  • test forms

  • position in the test book(s)

  • number of answer choices

  • answer key (list of correct responses and scoring rubrics)

  • passage/stimulus name

  • images of items

  • item graphics and other art or stimuli associated with items

  • administration date

  • status (field test or operational test)

  • history of use including test form(s), page, and item number

  • history/tracking of changes/edits made to items during reviews, etc.

  • identity of the person who requested or authorized changes/edits


3.2.2.1.9.3 Psychometric Information for Content/Form Management System

Of particular significance is how all of the psychometric data will be collected, analyzed, verified, and stored for future use in building test forms. The content/form management system must be capable of retrieving and utilizing both traditional and IRT item parameter values for use in test item selection and test construction procedures. The plan must specify what information will be included; how missing parameters or statistics will be acquired or computed; any transformations of parameters needed using historical data files; and any limiting issues anticipated, along with proposed solutions.


The content management system must be able to incorporate item data, when appropriate, that may include:

  • maximum item information

  • location of maximum information

  • IRT parameters and statistical values

  • fit index, chi-square values

  • difficulty values

  • classical item analysis for distractors

  • DIF statistics, including contrast values for Mantel-Haenszel comparisons.

The vendor may suggest other psychometric data that needs to be collected. The Agency requires a Technical Report to include the above mentioned data for the field test as well as for the operational administrations.


3.2.2.1.9.4 Software/Hardware Concerns for Content/Form Management System

Security of the content/form management system is of utmost importance to the Agency. The Agency must have twenty-four hour, seven days per week continuous access to this secure site. All update costs for software/hardware must be defined in the proposal and the costs must be honored throughout the life of the contract. The costs will be evaluated in the cost proposal.


3.2.2.1.9.5 Optional Content/Form Management System Tasks

Vendors may propose additional content/form management system tasks, or activities, if they will substantially improve the results of the project. The additional tasks, or activities, must be described in detail within this section, yet separated from the required items in the cost proposal.


3.2.2.1.10 Copyright Issues

The item selection development plan must include a schedule for acquiring copyright permission as needed for new development and uses, as well as for future uses of any existing passages. Vendors must fully describe their experience with obtaining and retaining copyrights, the processes used to perform this, and a description of the specific personnel to be assigned to this task and their appropriate experience.


The vendor is responsible for maintaining copyright agreements obtained by any means and for securing agreements with copyright holders for continuing use of test passages for a period of 10 years, for a variety of potential purposes as follows:

  • publication in the Grades 3-11 tests over the life of the contract

  • publication in any interpretive or public usage products

  • use in interpretive or public usage products in the form of electronic media distributed to counties, or other parties

  • use in the form of electronic media for Internet, or any future electronic access

The vendor is liable for assuring all copyright permission acquisitions. The vendor’s proposal should explicitly provide assurance that the Agency will be held harmless should the vendor fail to acquire copyright permissions.


3.2.2.1.11 Examples of Items

The vendor must provide examples of multiple choice items, constructed response items, gridded response items and any other item format that the vendor may deem appropriate at a grade level of the vendor’s choice for each content area. All examples must show thinking skill levels and alignment to the 21st Century WV CSOs.



3.2.2.1.12 Examples of Page Layouts

The vendor must provide a two-page layout for a single grade of the vendor’s choice for each content area. The layout should demonstrate ink coloration, font size, white space, paper quality, passage-to-item layout, presentation of answer options, art work, method of communicating “stop” and “go on” options to students, numeration, and the solution for addressing blank pages within the test booklet.


3.2.2.1.13 Item Alignment to the 21st Century WV CSOs

Items may need to be developed to augment the WESTEST NRT/CRT component for grades 3-11. Items and passages must be high quality and must align with the 21st Century West Virginia Reading/Language Arts, Mathematics, Science and Social Studies Content Standards and Objectives (CSOs). Vendors must submit their processes to ensure that the development of all passages and items will be in direct alignment with West Virginia’s CSOs using the Webb Alignment Model. (See Section 3.1.2) The alignment criteria may need to be adjusted with any future state or federal requirements. The Agency will contract with an independent vendor to conduct an external alignment study.


Vendors should be aware that future development schedules might be adjusted depending on the revision schedule for the CSOs. In this event, the existing blueprints and item specifications will need to be updated to reflect any changes in the standards, skills or the classification system for the objectives.
3.2.2.1.14 Psychometric Research and Technical Services

3.2.2.1.14.1 Introduction

Maintaining test validity, reliability, and the equivalence of tests and score scales across years is a fundamental priority of the WESTEST Grade 3-11 program. The vendor will be responsible for psychometric services related to the development of valid, reliable, generalizable, equitable (by race, ethnicity, gender, and all other applicable criteria), bias-free, and legally defensible assessments.


These services shall include: planning and coordination of data design and analysis; producing descriptive statistics; using an Item Response Theory (IRT) model to scale the field test items; standard setting; providing measurement consultation; and conducting any additional special studies as needed to document the technical adequacy of the tests. Additional studies may be proposed by the vendor for consideration.
The IRT model used for the field test data must be the same as that used for the operational tests. Multiple-choice items are to be calibrated and scaled using the three-parameter, logistic model (3PL); and constructed-response/gridded-response items are to be calibrated and scaled using the two-parameter, partial credit model (2PPC). Any other formats suggested by the vendor must meet these same specifications and the vendor must provide the scoring model for the additional item types.
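For illustration only, the two scoring models named above can be sketched in a few lines of code. The parameterization below (discrimination `a`, difficulty `b`, guessing `c`, step parameters `deltas`, and the conventional scaling constant D = 1.7) is standard IRT notation, not a specification drawn from this RFP, and the function names are illustrative assumptions:

```python
import math

def p_3pl(theta, a, b, c, D=1.7):
    """Probability of a correct response under the three-parameter
    logistic (3PL) model, for ability theta and item parameters a, b, c."""
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

def p_2ppc(theta, a, deltas, D=1.7):
    """Category probabilities under a two-parameter partial credit (2PPC)
    form: a common slope `a` per item and one step parameter per score
    boundary. Returns P(score = k) for k = 0 .. len(deltas)."""
    # Cumulative logits: z_0 = 0, z_k = sum over j <= k of D*a*(theta - delta_j)
    z = [0.0]
    for d in deltas:
        z.append(z[-1] + D * a * (theta - d))
    exps = [math.exp(v) for v in z]
    total = sum(exps)
    return [e / total for e in exps]
```

Under the 3PL, for example, a student of average ability (theta = 0) facing an item with a = 1, b = 0, and guessing parameter c = 0.2 has a correct-response probability of 0.2 + 0.8 × 0.5 = 0.6.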
Vendors should note the technical specifications outlined in this section reflect minimum requirements. Vendors must provide a detailed plan of all proposed psychometric research and technical services necessary to deliver quality products. All test design activities must be conducted according to the most recently published version of the Standards for Educational and Psychological Testing (AERA, APA, and NCME). The successful vendor must provide a Technical Report for the field test and each operational test administration. A copy of a sample Technical Report should be submitted for review as per this RFP.
The vendor will conduct and provide results from a comparability study of WESTEST 8, 10, and 11 and the grades 8 and 10 college predictive exam benchmark and the grade 11 college entrance exam benchmark. A Technical Report of the results of this study will be bound and presented to the Agency.
3.2.2.1.14.2 Descriptive Statistics

Following each field test, or operational administration, the vendor will provide the Agency with a detailed item analysis. The vendor will provide Item Response Theory-based and classical statistics for all field test items.



  • For multiple-choice items, the analysis will divide the student population into the five performance levels (distinguished, above mastery, mastery, partial mastery, and novice) using total performance on the test, and show the relationship between answer choices and level of performance.

  • For gridded-response items, the analysis will show the relationship between total test performance category and the six most frequent answers given by students.

  • For constructed response/performance tasks, the analysis will demonstrate the proportion of students in each total test performance category achieving each score point.

  • For any other suggested item format, the vendor will provide the required relational data.

For all items, the analysis will indicate the correlation between the item and performance on the relevant subscore category and the total test. The vendor will provide, at a minimum:



    • difficulty estimates, p-values, and coefficient alpha reliability for every item, including point biserials for all multiple choice items

    • alpha reliability estimates for each test form and standard

    • inter-rater agreement indices if applicable

    • differential item functioning (DIF) statistics (using at least two procedures)

    • comparability study for paper/pencil vs. online administration

DIF analyses to detect possible item bias are to be conducted for Caucasian, African-American, Asian/Pacific, and Hispanic racial/ethnic groups, as well as for gender and all other subgroups identified by NCLB as appropriate. Values for items resulting from these (and all other pertinent) analyses will be included in the vendor’s content/form management system. Changes in DIF values across administrations will be analyzed.
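As an illustration of one of the at-least-two DIF procedures required above, the Mantel-Haenszel contrast can be computed from 2x2 tables of correct/incorrect counts for the reference and focal groups, stratified by matched total score; the -2.35 multiplier places the log of the common odds ratio on the conventional ETS delta scale. This is a minimal sketch, not the vendor's required implementation, and the data layout is an illustrative assumption:

```python
import math

def mantel_haenszel_delta(strata):
    """Mantel-Haenszel DIF contrast (ETS delta scale).
    `strata` is a list of tuples, one per matched total-score stratum:
    (ref_correct, ref_incorrect, focal_correct, focal_incorrect)."""
    num = den = 0.0
    for a, b, c, d in strata:
        n = a + b + c + d
        if n == 0:
            continue
        num += a * d / n   # reference-correct x focal-incorrect
        den += b * c / n   # reference-incorrect x focal-correct
    alpha = num / den      # common odds ratio across strata
    return -2.35 * math.log(alpha)   # negative values disfavor the focal group
```

An item that behaves identically in both groups yields an odds ratio of 1 and a contrast of 0; increasingly negative values indicate DIF against the focal group.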


Other descriptive analyses will include, but not be limited to, means and standard deviations for the population and by each demographic subcategory (i.e., gender, ethnicity, disability status, and English-speaking status), item total correlations, and frequency distributions. IRT analyses of field test data will include Item Characteristic Curves (ICC) for all the parameters for each item. Vendors are encouraged to propose additional analyses, based on their experience and emerging statistical theory. The vendor must provide a sample of comparability protocols or other comparability studies for committee review.
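A minimal sketch of several of the classical statistics required in this section (p-values, point-biserial correlations, and coefficient alpha) for a dichotomously scored response matrix; the function name and data layout are illustrative assumptions, not Agency specifications:

```python
import math

def item_statistics(responses):
    """Classical item analysis for a 0/1 scored matrix.
    `responses[i][j]` is the score of student i on item j.
    Returns (p_values, point_biserials, coefficient_alpha)."""
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]

    def mean(xs):
        return sum(xs) / len(xs)

    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # p-value: proportion correct per item
    p_values = [mean([row[j] for row in responses]) for j in range(n_items)]

    # point-biserial: correlation of each item score with the total score
    pbis = []
    t_mean, t_sd = mean(totals), math.sqrt(var(totals))
    for j in range(n_items):
        col = [row[j] for row in responses]
        cov = mean([(c - p_values[j]) * (t - t_mean) for c, t in zip(col, totals)])
        i_sd = math.sqrt(var(col))
        pbis.append(cov / (i_sd * t_sd) if i_sd > 0 else 0.0)

    # coefficient alpha: (k/(k-1)) * (1 - sum of item variances / total variance)
    item_var = sum(var([row[j] for row in responses]) for j in range(n_items))
    alpha = (n_items / (n_items - 1)) * (1 - item_var / var(totals))
    return p_values, pbis, alpha
```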
3.2.2.1.14.3 Validity

In alignment with the Agency’s requirements for technical excellence, the vendor will be responsible for establishing and documenting various evidences of validity. This will include, but not necessarily be limited to, the following:



  • evidence of content validity: the alignment to WV CSOs, test blueprints, item specifications, and test items (See Section 3.1.2)

  • evidence that test item formats measure the intended content

  • evidence of the interrelationship among standards

  • evidence items were chosen based on test specifications

  • evidence that alternate forms cover the same content

As per the NCLB Peer Review Guidance Document, the agency requires that the vendor provide data/studies to provide the Agency with the following:



  • Evidence that scoring and reporting structures are consistent with the sub-domain structures of its academic content standards (i.e. are item interrelationships consistent with the framework from which the test arises)

  • Evidence that the Agency has ascertained that test and item scores are related to outside variables as intended (specifically ACT and NAEP) and not to irrelevant characteristics, such as demographics

Vendors should provide a plan for demonstrating the validity evidences mentioned above, along with any others, based on their experience and emerging statistical theory. In addition, the vendor must provide validity evidence regarding any applicable surveys, technical studies, contrasting groups studies, descriptions of norm groups, test form equating, test level equating, and field tests. All costs for additional studies which are needed to provide the above information must be included in the vendor’s cost proposal delineated by study and cost.


3.2.2.1.14.4 Reliability

The vendor will be responsible for establishing and documenting methods to collect evidence of the reliability of test scores, and, if included, the scoring of constructed response items. This evidence of test score reliability will include, but not necessarily be limited to:



  • scale score/performance level determinations

  • classical and IRT measures of reliability and standard errors for totals and subscores

  • analysis of model fit

  • analyses of item local dependence

  • evidence of the reliability of the constructed response scores

  • analyses of the reliability of classification decisions in relation to achievement levels

  • level classifications

  • inter-rater reliability, if applicable

  • frequency distributions of student scores on constructed response items

  • internal consistency of total scores

  • decision consistency

  • generalizability estimates of standard errors

  • reliability and standard errors of group mean scores and classifications of students by group

  • reliability and standard errors of year-to-year changes in school means

As per the NCLB Peer Review Guidance Document, the agency requires that the vendor provide data/studies to provide the Agency with the following:



  • reliability of the scores it reports, based on data from its own student population and each reported subpopulation

  • quantified and reported within the technical documentation for its assessments the conditional standard error of measurement and the consistency of student classification at each cut score specified in its academic achievement standards

  • reported evidence of generalizability for all relevant sources, such as variability of groups, internal consistency of item responses, variability among schools, consistency from form to form of the test and inter-rater consistency in scoring

Vendors must provide a plan for demonstrating the reliability evidences mentioned above. In addition, the vendor must provide reliability evidence regarding any applicable surveys, technical studies, contrasting groups studies, descriptions of norm groups, test form equating, test level equating, and field tests. All costs for additional studies which are needed to provide the above information must be included in the vendor’s cost proposal delineated by study and cost.


3.2.2.1.14.5 Scale Scores - Calibration Scaling and Equating Procedures

The processing and scoring of the WESTEST Grades 3-11 answer documents requires the calibration, scaling, and equating of student responses. Vendors must provide a detailed plan that articulates the scaling procedures used for assessments.


Responses to test items will be analyzed using an Item Response Theory (IRT) model. In West Virginia, multiple-choice items are to be calibrated and scaled using the three-parameter, logistic model (3PL); and all constructed-response/gridded-response items are to be calibrated and scaled using the two-parameter, partial credit model (2PPC). All item parameters are to be placed on a common scale. Student total scores are based on item parameter estimates and are to be obtained using pattern scoring. These analysis procedures will be implemented using a probability sample of West Virginia schools.
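Pattern scoring, as referenced above, estimates each student's ability from the full response pattern rather than from the raw number correct, so two students with the same number right but different patterns can receive different scores. A minimal grid-search sketch under the 3PL; the item parameters, grid bounds, and function names are illustrative assumptions:

```python
import math

def p_3pl(theta, a, b, c, D=1.7):
    """3PL probability of a correct response."""
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

def pattern_score(responses, items):
    """Maximum-likelihood pattern score: grid-search the theta that
    maximizes the likelihood of the observed 0/1 response pattern.
    `items` is a list of (a, b, c) parameter tuples."""
    best_theta, best_ll = 0.0, -math.inf
    for step in range(-400, 401):          # theta grid from -4 to +4 in 0.01 steps
        theta = step / 100.0
        ll = 0.0
        for u, (a, b, c) in zip(responses, items):
            p = p_3pl(theta, a, b, c)
            ll += math.log(p) if u == 1 else math.log(1.0 - p)
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta
```

Operational scoring would use a proper optimizer (or Bayesian estimation) rather than a fixed grid, but the ordering behavior is the same: more correct responses on harder items move the estimate upward.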
The vendor is responsible for designing a sampling plan, calibrating the current test, and equating the score scale of the current test to the base year of the assessment. This “calibration sample” should include at a minimum 10,000 students per grade level appropriately representative according to West Virginia student demographics and the NCLB subgroups (economically disadvantaged students, students with disabilities, students with Limited English Proficiency (LEP), major racial and ethnic groups and gender). The separate writing score must be scaled and merged with the reading scores to generate the WESTEST Reading/Language Arts Report.
The first operational year, 2008-2009, will serve as the base-year scale for item parameters and scores. The second operational test form for each grade will be equated to the base-year scales using the Stocking and Lord Procedure.
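The Stocking and Lord procedure finds the linear transformation (slope A, intercept B) of the new form's theta scale that makes its test characteristic curve match the base form's as closely as possible. The grid-search sketch below is a simplified illustration of that criterion (operational implementations use numerical optimization over anchor items); the parameter ranges and step sizes are illustrative assumptions:

```python
import math

def p_3pl(theta, a, b, c, D=1.7):
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

def tcc(theta, items):
    """Test characteristic curve: expected number-correct score at theta."""
    return sum(p_3pl(theta, a, b, c) for a, b, c in items)

def stocking_lord(base_items, new_items):
    """Grid-search the linear transformation (A, B) that places the new
    form's theta scale onto the base scale by minimizing squared TCC
    differences over a theta grid (the Stocking-Lord criterion)."""
    grid = [t / 10.0 for t in range(-30, 31)]
    best = (1.0, 0.0, math.inf)
    for Ai in range(10, 31):               # slope A in [0.5, 1.5], step 0.05
        A = Ai / 20.0
        for Bi in range(-20, 21):          # intercept B in [-1, 1], step 0.05
            B = Bi / 20.0
            # transform new-form parameters: a* = a/A, b* = A*b + B
            trans = [(a / A, A * b + B, c) for a, b, c in new_items]
            loss = sum((tcc(t, base_items) - tcc(t, trans)) ** 2 for t in grid)
            if loss < best[2]:
                best = (A, B, loss)
    return best[0], best[1]
```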
Braille versions of the tests not containing the same items as the regular tests are to be calibrated and scaled separately. Data from all students included in the calibration sample are used to scale Braille test versions, after removing items from the operational item set not presented to visually impaired students. The vendor may propose a process for calibrating and equating, with justification, which differs from the previously described process.
3.2.2.1.14.6 Statistical Software

The vendor may utilize proprietary software for calibration, scaling, and equating, but should provide a fully licensed copy (or a copy of implemented “shelf” software) to the Agency for this confidential review. This software must be transferable to an Agency subcontractor, if required. Vendors must describe in their proposal the hardware prerequisites and the training provided with this software, and are to provide the name, historical usage, and an overview of the software to be used to complete this task. Additionally, vendors are to provide a description of the technical support for the software and transition plans should the software need to be upgraded, or replaced. No-cost technical support must be provided for the length of the contract.


3.2.2.1.14.7 Vertical Scaling

Student achievement on West Virginia tests is reported using scale scores and vertical scale scores, among others. Vendors must create a vertical scale providing for reporting growth continuously from Grades 3-11 for the 3-11 package, or for Grades K-11 if bidding the K-2 package.


Vertical scaling data will be collected by embedding items in the Spring 2008 field test. In addition to this method, vendors may propose alternative strategies, if desired. The vendor will describe the procedures and requirements for conducting vertical scaling prior to the selection of items and construction of forms. Any alternate procedures must be approved by the Agency. Vertical scaling must encompass all items, including the CRT and NRT portions of the assessment.
3.2.2.1.14.8 Standard Setting

Vendors must provide a detailed plan regarding the procedure recommended for the Agency to determine cut scores, as well as cut score ranges for performance standards for each grade level K-2, or K-11. The Agency will have final approval of all cut scores and will have the final approval on the methodology used to derive such scores.


The vendor is responsible for facilitating the Agency’s process to establish achievement level standards in consultation with West Virginia educators and citizens. The WV CSOs have associated performance descriptors that provide the basis for assessing overall student competence of grade level standards. With the ultimate goal of “learning for all,” these descriptors allow the teacher, students and parents to judge the level of student proficiency in each 21st Century learning standard. The five West Virginia Performance Levels are defined below:
Distinguished: A student at this level has demonstrated exceptional and exemplary performance. The work shows a distinctive and sophisticated application of knowledge and skills that go beyond course, or grade level expectations.

Above Mastery: A student at this level has demonstrated competent and proficient performance and exceeds the standard. The work shows a thorough and effective application of knowledge and skills.

Mastery: A student at this level has demonstrated fundamental knowledge and skills that meet the standard. The work is accurate, complete, and fulfills all requirements. The work shows solid academic performance at the course, or grade level.

Partial Mastery: A student at this level has partially demonstrated fundamental knowledge and skills toward meeting the standard. The work shows basic but inconsistent application of knowledge and skills characterized by errors and/or omissions. Performance needs further development.

Novice: A student at this level has not demonstrated the fundamental knowledge and skills needed to meet the standard. Performance at this level is fragmented and/or incomplete and needs considerable development.
Performance standard setting methods must include both test-based methods and student-based methods. The vendor will be responsible for developing specifications for the achievement level standard setting process and utilizing IRT item values in a bookmark-based procedure. The specifications will address the nature of the proficiency level standards, methods for determining the standards, and procedures for validating and analyzing the quality of information reported using the achievement levels. The vendor’s proposed timeline for completing the proposed services must respect the Spring 2008 field test administration and the Spring 2009 operational test dates. Standard Setting must be completed in Fall 2008.
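In a bookmark-based procedure, a panelist's bookmark placement maps to a cut score through a response-probability criterion: the cut is the theta at which the bookmarked item would be answered correctly with a specified probability (commonly RP67). A minimal sketch of that mapping under the 3PL, with illustrative item parameters; actual standard setting aggregates such cuts across panelists and rounds:

```python
import math

def p_3pl(theta, a, b, c, D=1.7):
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

def bookmark_cut(a, b, c, rp=0.67):
    """Theta at which the bookmarked item is answered correctly with
    probability `rp` (the response-probability criterion), found by
    bisection; requires rp > c so the target probability is attainable."""
    lo, hi = -6.0, 6.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if p_3pl(mid, a, b, c) < rp:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

For a 2PL-like item (a = 1, b = 0, c = 0) the RP67 cut falls at theta = ln(0.67/0.33)/1.7, roughly 0.42.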
The vendor will be responsible for organizing and implementing the standard setting process, based on the achievement level specifications, and for conducting the standard setting meetings (in conjunction with the Agency). The standard setting process will involve meetings of Standard Setting Committees, consisting of West Virginia teachers and instructional leaders.
The vendor will also be responsible for developing a Standard Setting Technical Report that outlines the processes, procedures, materials, etc. used in the Standard Setting. The Standard Setting Technical Reports should include, but not be limited to:


  • Executive Summary

  • Standard Setting Overview

  • Standard Setting Agenda

  • Training Overheads

  • Training Materials

  • Evaluation Results

  • Median Results and Impact Data

  • Graphical Presentations by Content and Grade

  • Standard Error Tables

  • Interpolated Cut Scores by Grade (Including performance levels)

  • Evidence for Procedural Validity

The vendor should provide samples of Standard Setting Technical Reports for committee review.


3.2.2.1.15 Statistical Analyses for Special Populations and Other Studies

Vendors may propose other appropriate studies in the event the RFP has not covered every eventuality. In addition, the Agency may, in consultation with the vendor, over the life of the contract, request additional studies to meet a particular need. If studies are suggested, the vendor must provide examples for committee review.


3.2.2.1.16 External Quality Control

The Agency and the vendor shall operate separate quality control operations. In so doing, the Agency may utilize the services of one, or more, vendors to assist in verification of the quality and accuracy of the statewide assessment results. If implemented, these vendors will work under the direction of the Agency and will perform data verification checks at times and places so designated. The vendor will be obligated to provide data, information, explanations, and workspace, if necessary, for Agency staff and data verification vendors working with the Agency. The objective of the quality control processes is to triangulate analyses and to verify the data being reported are correct. The Agency will review all quality control findings and will provide timely permission for the vendor to score and distribute test results. The vendor must provide examples of their quality control protocols and procedures for committee review.


3.2.2.1.17 Psychometric Support

A high level of communication is absolutely essential in the area of psychometric support. While not every element of this communication can be defined here, the Agency expects the utmost in consultation and availability of the vendor’s psychometric staff.


The vendor proposal must describe procedures to provide measurement consultation services to the Agency staff (and others) and describe procedures for providing measurement consultation in a collaborative manner. These support services may include problem-solving discussions, consultation with outside experts, participation in content/bias reviews, review of field test and operational data, exploratory analyses, unexpected measurement or technical issues, or more formal studies. The successful vendor must be prepared to provide psychometric support by working closely with the Agency to resolve all such issues. Attendance of selected psychometric staff at West Virginia Technical Advisory Committee meetings is mandatory.
Under Section 3.2.4.6.5, vendors should provide strong evidence and identification of experienced key psychometricians, research and measurement specialists, data analysts, legal experts, and other key technical staff assigned to the program.
3.2.2.1.18 Technical Reporting

3.2.2.1.18.1 Analyses Reports

The vendor will be responsible for providing a hard copy and an electronic copy of all data analyses to the Agency. The vendor will also provide the data files and a record layout to the Agency. The electronic format and data layout will be mutually agreed upon by the vendor and the Agency at the first technical meeting.


3.2.2.1.18.2 Final Technical Reports/Documents

The vendor will be responsible for designing, writing, and producing Technical Reports to provide documentation of all technical work associated with each field and operational test. The content of the reports shall include detailed narrative descriptions of content and bias reviews, item review/selection, validity and reliability studies, scaling, and item statistics. These reports will provide sufficient information to allow for an independent evaluation of the quality of the assessments.


Following each field and operational administration, the vendor will produce both draft and final technical documents (incorporating recommended revisions) based on an overall analysis of the administration. The draft technical document will be reviewed by the Agency and by the Technical Advisory Committee (TAC) prior to completion of the final copy of the report. Recommendations and changes will be incorporated into the final draft.
Technical manuals should include the following sections. Others may be proposed, as the vendor deems necessary (and when applicable):

  • Purpose

  • Background

  • Overview of test design

  • Test development procedures employed to construct the test forms

  • Evidence of fairness in development of test

  • Test administration

  • Test scoring

  • Handscoring reliability and validity

  • Online reliability and validity

  • Rater effects

  • Description and analysis of sampling procedure

  • Description and analysis of calibration and equating

  • Model fit; local dependence

  • Scaling and equating procedures

  • Reliability and validity of individual and group scores

  • Reliability of classification decisions, including SEMs

  • Reliability of year-to-year changes in school means

  • Generalizability for all relevant sources, such as variability of groups and internal consistency of item responses

  • Description and analysis of equating procedure

  • Vertical scaling procedure

  • A comparison of the characteristics of the current test administration to previous administrations

  • Reports

  • Performance Standards

  • Quality Control Procedures

  • Glossary Of Terms
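Several of the reliability sections above reduce to standard psychometric formulas that the technical report would be expected to document. As an illustrative sketch only (the sample data are invented, and the vendor's operational analyses would use its own calibrated procedures), internal-consistency reliability and the standard error of measurement (SEM) can be computed as:

```python
import math

def cronbach_alpha(item_scores):
    """Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances) / total-score variance).
    item_scores is a list of rows, one row of item scores per examinee."""
    k = len(item_scores[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in item_scores]) for i in range(k)]
    total_var = var([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def sem(sd, reliability):
    """Standard error of measurement: SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)
```

For example, a test with a score standard deviation of 10 and reliability of 0.91 yields an SEM of 3 scale-score points, the kind of figure the classification-consistency section of the manual would report around each cut score.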

The Agency will work with the vendor to determine the final contents of the Technical Report and any supplements. The vendor will provide ten printed copies of the final report and an electronic version in the format determined by the Agency. The report will include tabular and graphic displays of data to illustrate the characteristics and quality of test scores. Hard copy reports must be professionally bound and labeled.


The Agency will work with the vendor on the delivery date for the technical manuals and documents, but in no case shall delivery exceed 60 working days from the arrival of written reports in the schools. The vendor must submit a sample Technical Report document for the committee’s review.
3.2.2.1.19 Materials Production

The vendor shall be responsible for developing and producing all test materials (e.g., test forms, answer documents, manuals, header sheets, etc.) for the administration of the Agency’s annual tests in compliance with Agency printing specifications. Development and production should include, but is not limited to, writing, copy editing, proofreading, graphic design, layout, and print or electronic publishing of documents.


The vendor will also print any additional materials needed to implement the project, such as transmittal memoranda, labels for packing, and packing lists. The vendor will be responsible for all aspects of production for publishing printed products, including formatting, graphics, and key entry.

For each publication, the vendor will submit for approval printing plans that identify type size and style, ink and paper color, paper quality, and layout. Printing examples that show type size and style will be included. The Agency desires attractive, high quality printed materials at reasonable cost.


See Table 12 and Table 13 in Section 3.2.2.1.6.3 for additional material specifications and a description of the test documents comprising the field test in Spring 2008 and the first operational test in Spring 2009. The vendor can expect the configurations in 2010 and beyond to be similar to those shown for 2009; however, the Agency reserves the right to change this configuration as needed.
3.2.2.1.19.1 Test Booklets

Two WESTEST forms for grades 3-11, measuring the CSOs in the content areas of Reading/Language Arts, Mathematics, Science, and Social Studies, are to be constructed. The forms will alternate, and the form not being used will serve as the breach form. Vendors shall propose a defensible plan for the final test form construction process, including a description of the final documents. The Agency may choose to modify the design of the test booklets and answer documents prior to any test administration within the constraints of the materials specifications. The format of test booklets and answer documents for grades 3-11 will depend upon the final option selected by the Agency.


The Agency must receive a minimum of 10 copies of each test form and answer document before distribution to local counties. After distribution to counties is completed, all overage is property of the Agency. Each year, the vendor will solicit the specific quantities to be shipped to the Agency and quantities for disposition. The vendor will store used paper answer documents for the duration of the contract, and used test booklets for one calendar year from the date of retrieval.
The test booklets have the following general characteristics:

  • Different item types may be interspersed throughout the test books.

  • If required, blocks of embedded field test, or anchor items, may be located in various locations within test books.

  • Multiple field test forms will be produced and packaged for spiraled distribution in Spring 2008.

  • Depending upon the test option described, tests may include a combination of up to 85 multiple-choice, constructed-response, or gridded-response questions per content area for grades 3-11.

  • Tests for grades 3-11 will include a Writing Assessment.

  • Reading passages are published works for which copyrights are to be obtained.

  • Reading passages may be reproduced with extensive graphics and pictures. Mathematics, Science, and Social Studies items may utilize graphics extensively. Graphics may also be included extensively in constructed-response answer documents.

  • Universal Design Principles should be considered when selecting graphics and pictures.
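Spiraled distribution means the multiple field test forms are packaged in a repeating rotation so that each classroom receives a near-even mix of forms, yielding randomly equivalent groups for field test analyses. A minimal sketch of this packing logic (the form labels and a four-form design are assumptions for illustration, not specifications from this RFP):

```python
from itertools import cycle

def spiral_forms(num_students, forms=("A", "B", "C", "D")):
    """Assign field test forms in a repeating cycle ("spiraling") so each
    form appears roughly equally often in a classroom's packing sequence."""
    form_cycle = cycle(forms)
    return [next(form_cycle) for _ in range(num_students)]

# A class of 10 students receives forms A B C D A B C D A B:
assignments = spiral_forms(10)
```

Because every form count in a spiraled sequence differs by at most one, examinee groups taking each form can be treated as randomly equivalent for item calibration.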


3.2.2.1.19.1.1 Custom Covers

Test booklets should have customized covers. The cover must dovetail with a selected color scheme and have a similar look to test material from other WV assessment programs. The vendor may provide options to customize covers for APTA grades 3-11 to include the Agency logo for committee review.


3.2.2.1.19.2 Answer Documents

The answer document design may vary according to grade level depending on the option(s) selected by the Agency. The Agency requires each answer document for WESTEST grades 3-11 to be on quality paper, and each answer document should include certain student information across all grade levels and subject areas. This includes such items as name, WVEIS identification number, gender, race/ethnicity, date of birth, classification as to exceptional education services or limited English proficiency, etc.
The answer documents shall also include appropriate fields for Agency and county use. Vendors should be aware that, in accordance with policy changes from the federal level, the Agency may alter the current design of the race/ethnicity fields to be in proper compliance with federal guidelines. The vendor must provide an example of answer documents for committee review.
The program requires an answer document for each student attempting the test. The end result will be production of a data file containing the results for all students who were properly assessed, along with any invalidated test scores. The Agency and the counties/schools will be provided with reports containing the results for their students.

