
3.2.2.2.20.4 Test Examiner’s Manual


A separate Test Examiner’s Manual will not be developed for the APTA field test or for the operational years. Instead, the Test Examiner’s Manual must be incorporated into the Student Test Booklet.

3.2.2.2.20.5 County Test Coordinator’s Manual


A County Test Coordinator’s Manual will be developed for the field test and for each operational year. Manuals will contain coordinator-level instructions for handling and administering the appropriate grade level assessments. They also will include information about security of materials. The manual will be available in an electronic format accessible via the Agency’s website prior to the hardcopy distribution.
The vendor shall submit to the Agency, in a timely manner, the final proofs of the scannable consumable test booklets along with the proofs of the County Test Coordinator’s Manual to ensure the testing instructions are consistent with the testing instruments. The vendor shall ensure manual proofs are free of typographical and format errors before they are submitted to the Agency. All manuals will be subject to the same proofing and printing stages as the test booklets. The vendor’s proposal must include sample manuals for the committee’s review.
3.2.2.2.20.6 Other Ancillary Documents

The vendor will also print any additional materials needed to implement the project, such as transmittal memoranda, packing labels, and packing lists. These documents are to include, but are not limited to:

  • County Header

  • School Header

  • Answer Document Envelopes

  • Labels

  • Materials Checklist

  • Memos

  • Security Checklists

All ancillary documents will be submitted to the same proofing and printing stages as the test booklets and answer documents. The vendor shall ensure manual proofs are free of typographical and format errors before they are submitted to the Agency. The vendor shall submit a schedule for reviews of the manuals and other ancillary materials.


3.2.2.2.20.7 Braille

For student-level test documents, the vendor will provide sufficient quantities of Braille versions at each grade level. A publisher of Braille materials approved by the Agency will produce Braille versions of the scannable consumable test booklets and other documents at the vendor’s expense. The vendor will provide the electronic files in the correct format to the Braille subcontractor(s). The vendor will also develop the test administrator notes and scripts accompanying the Braille test versions.


The vendor must honor the American Printing House for the Blind (APH) guidelines for Braille (refer to http://www.aph.org/). For printed Braille products, the vendor is responsible for having the materials proofed by an independent party. The vendor must provide assurances to the Agency that all errors identified by the independent Braille proofreader were corrected. The Agency may also employ the services of internal or external Braille proofreaders. The vendor must provide a timeline for the review of all Braille materials.

3.2.2.2.20.8 Breach Forms


An appropriate quantity of breach forms must be available on an annual basis. Five test booklets at each grade level of the 2010 operational form must be available for use as the breach form for the 2009 operational administration. In 2010, five test booklets at each grade level of the 2009 operational form must be available for use as the breach form. This rotational format will continue throughout the life of the testing contract.
3.2.2.2.20.9 Materials Distribution/Retrieval

Many forms and materials are needed to implement the APTA. Some of the materials listed in this section will assist schools, counties and the state in implementing quality control procedures and will ensure the integrity of the data collected by the program. The Agency also uses special forms to evaluate the quality of the assessment program and its implementation annually.


3.2.2.2.20.10 Scoring Materials

The specifics for scoring APTA include:



  • Technical services related to the production and interpretation of results

  • Technical assistance/psychometric services to combine results for reading subtest score, language arts subtest score and the writing score into a reading/LA score for grades 3-8 and 11

  • Plan to integrate WVEIS student information with the scoring system and to export student scores and item response files back into the WVEIS files

  • Use of WVEIS unique student identifier

  • Plan to create output files including assessment data items as required by NCLB and Individuals with Disabilities Education Improvement Act of 2004 (IDEA)

  • Log-in process for assessment materials

  • Conversions and statistical measure calculations process

  • Schedule to customize (if needed) scoring software

The vendor is responsible for all arrangements and costs associated with packing, distributing, and returning materials. Prompt and accurate delivery of materials is important to Agency and to local county personnel who have the responsibility of managing test materials. There must be 100 percent accountability of all test booklets and answer booklets returned by the counties using bar code labeling systems. The vendor must guarantee distribution procedures are satisfactory. Vendors’ proposals must include descriptions of the procedures they will use to complete these tasks.


3.2.2.2.21 Pre-identification of Scannable Consumable Test Booklets

The vendor will be responsible for pre-identifying scannable consumable test booklets using data transmitted via an electronic medium from the Agency for each administration. The vendor must establish a system to allow the Agency to transmit pre-identification data electronically over a secure data transmission network accessible only to authorized users. The vendor will bear the cost of establishing the system and providing network-specific software if needed by the Agency to access the system.


For the operational administrations, Agency will submit pre-identification files to provide estimated counts for print materials. A second file will be submitted for barcode labels for the scannable consumable test booklets. The vendor will then send labels (packaged by school) to arrive in districts two weeks before the test administration.

The vendor will establish a system to ensure the barcode labels delivered to counties contain accurate data and are printed at a level of quality that permits accurate scanning and precludes the possibility of smudging. The vendor will provide the Agency with a checking program to be used before submitting data to the vendor to help ensure all data fields include acceptable data.
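
As a rough illustration only, the sketch below shows the kind of field-level checks such a checking program might perform before a pre-identification file is transmitted. The field names, grade codes, and CSV layout are hypothetical assumptions; the RFP does not define the record format.

```python
# Illustrative sketch of a pre-identification data check; the field
# names and rules below are hypothetical, not the actual WVEIS layout.
import csv

REQUIRED_FIELDS = ["wveis_id", "last_name", "first_name", "grade",
                   "school_code", "county_code"]
VALID_GRADES = {"03", "04", "05", "06", "07", "08", "11"}  # tested grades

def check_preid_file(path: str) -> list[str]:
    """Return a list of human-readable error messages, one per bad field."""
    errors = []
    with open(path, newline="") as f:
        # start=2: the header row occupies line 1 of the file
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    errors.append(f"line {line_no}: missing {field}")
            if row.get("grade") not in VALID_GRADES:
                errors.append(f"line {line_no}: grade {row.get('grade')!r} "
                              "is not a tested grade (3-8, 11)")
    return errors

# Usage: run before transmitting the file to the vendor.
# for msg in check_preid_file("preid_spring.csv"):
#     print(msg)
```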


Vendors will provide quality control throughout the printing process to ensure the quality of label printing. Pre-identified labels will be packaged and labeled by school and shipped to counties as part of the shipment containing test books.
3.2.2.2.22 Optional Technology Systems

Vendors shall also propose solutions/examples and other technologies to permit barcode labels to be printed on dates closer to the actual spring test administration dates. Proposals for other technologies to pre-identify scannable consumable test booklets will be acceptable as long as the alternative provides at least the same level of timeliness, reliability, and accuracy as the labels. A pilot test of the vendor’s proposed system will be conducted for the Spring 2008 field test administration.


3.2.2.2.23 Pack and Distribute Materials

3.2.2.2.23.1 Packaging Specifications

The vendor will prepare packaging specifications to be delivered to the Agency four months before the field test and each operational administration. The packaging specifications will be updated as required to meet Agency needs. The packaging specifications will include the vendor’s procedures for packing and distributing materials to counties. The specifications will include a description of how the materials are packed, examples of packing and inventory lists for boxes to counties and schools, and the methods used for distributing materials. The Agency will, in collaboration with the vendor, develop procedures for the return and processing of materials to the Agency, and the vendor will provide samples of the procedures used for returning and processing materials.


3.2.2.2.23.2 Quantities

The Agency will provide the vendor with a list of the current names, addresses, email addresses, and phone and fax numbers of the County Test Coordinators. These persons will monitor all aspects of the assessment for their counties. Vendors must ship materials to county test coordinators at approximately 57 separate sites. The number of counties and special schools that serve as counties may change slightly during the life of the contract.


Table 20 below provides information about the estimated number of students who will receive testing materials. Approximately 710 public/private/parochial schools in 55 counties and 2 special districts will be tested. A list of the primary materials to be shipped for each administration, the quantities to be packaged for schools, counties, and the Agency, and other packaging specifications are given in Tables 18 and 19.

Table 20: FIELD TEST DESIGN – 2007-2008

Number of Public Students Based on 2006-2007 Enrollment Figures for Grades 3-8 and 11 APTA

  Grade    Number of Students
  3        225
  4        225
  5        225
  6        225
  7        225
  8        225
  11       225
  TOTAL    1,575


3.2.2.2.23.3 List of County Test Coordinators

The vendor will be responsible for maintaining a list of County Test Coordinators and updating it as notified by the Agency. At the beginning of the contract the Agency will provide the vendor with a data file containing a list of the counties and schools (names and identification numbers), and the numbers of students by grade tested during the last year. The vendor will be responsible for maintaining and updating this list. All test information/materials go directly to county test coordinators, not schools.


3.2.2.2.23.4 Packing, Distributing, and Receiving Materials Provisions

The vendor’s proposal and final specifications for packing, distributing, and receiving materials will address the following provisions:



  • The Agency will provide to the vendor an electronic school population update file indicating the number of students anticipated to be tested in each school. The vendor and Agency will decide on the method used to determine the final number of students per school. This number will be the basis for determining the quantities of materials to be shipped for each school and county. The vendor will generate packing lists based on these numbers.

  • The order of schools within a county on all lists and for shipping purposes will be by county/school WVEIS identification number.

  • The vendor will assume all materials will be shrink-wrapped in quantities specified by the Agency for shipping. No box will weigh more than 30 pounds.

  • The vendor will label the boxes of test books with the message “TO BE OPENED ONLY BY COUNTY TEST COORDINATOR” and mark all boxes with special colors or labels so they can be easily identified as secure materials. School boxes will be labeled with the number of the county and school and the name of the school. Labels must be approved by the Agency. Only boxes directed to county-level staff, such as boxes containing county overage, will be labeled with the name of the County Test Coordinator. The vendor will label boxes on the top and number them “Box 1 of X,” “Box 2 of X,” and so on, where X is the total number of boxes sent to that county. The box containing the packing information will be clearly identified for both school and county materials.

  • The vendor will pay charges on all materials shipped to the County Test Coordinators. The vendor will make arrangements for and pay for shipment if, due to a delivery error, the county is asked to ship materials to another county. The vendor must use an overnight delivery service for such shipments. Vendors will identify the procedures for delivery of testing materials. The Agency must approve all carriers.

  • Agency approval must be received before shipping printed products. Approval will be provided after the vendor and the Agency have received from the printer and have proofread examples of the printed products.

  • Materials will be packaged by school and sent to the County Test Coordinator in returnable boxes. The county will be responsible for distributing materials to the schools. The vendor is not responsible for any costs schools may incur in shipping test materials from their schools to the county office, unless the need to ship is the result of a packaging error by the vendor.

  • The Agency and counties will decide how materials overage will be distributed to the counties and the schools for each shipment. Overage remaining with the vendor must be shipped by the vendor by one-day service, if necessary.

  • The vendor will provide an online system for County Test Coordinators to order additional materials. The vendor will also staff toll-free phone and fax lines during the period in which materials are shipped, additional materials are ordered, and materials are picked up from schools and counties. The service will utilize one to three individuals, as needed, who are designated to respond only to West Virginia testing calls and perform other West Virginia work during this period. The vendor’s telephone lines must be staffed during Eastern Time Zone working hours.

  • It may be necessary for the vendor to ship the testing materials to the counties and Agency in up to four separate shipments per testing window – not including short shipments. Braille and large print materials are to be packaged and labeled separately and included in the shipment of test materials to counties.

  • The vendor will provide instructions for the materials being returned.

  • The vendor will be responsible for mailing or shipping by overnight delivery service or other means, as appropriate, any miscellaneous materials to the Agency and counties as situations arise. The vendor must secure the services of shippers who will provide inside delivery and unload large shipments onto loading docks.

  • The vendor must develop procedures to monitor the shipping and receipt of all materials and develop error logs. The date materials are received and who signed for the delivery must be documented. When problems arise, the vendor will be responsible for contacting the counties and the Agency concerning the problem and resolving the problem. The error logs will identify by school and county all failures to follow the established procedures and, if appropriate, how the errors were resolved. The error logs will be delivered to the Agency immediately after materials have been received in all counties.

  • After testing, using a numbering system, the vendor will verify, by number, all test books have been returned. The vendor will create a quality check to ensure each document will be matched for scanning. Beyond these measures, the vendor will, if needed, manually check answer documents to ensure 100% accuracy of check-in for secure documents. The vendor and Agency will resolve discrepancies in the numbers of returned test booklets.

  • The vendor will provide to the state, county and individual schools a security checklist or a similar procedure for tracking secure books.


3.2.2.2.23.5 Missing Materials Report and Inventory

The vendor will prepare a Missing Materials Report of secure scannable consumable test booklets based on the scanning completed during materials check-in. Reports will be prepared for each school with missing materials listing the number of test booklets missing and the identification of each. Any missing materials returned by counties will be recorded in the missing materials inventory maintained by the vendor. The missing materials reports must be delivered to the counties and the Agency 30 days after the check-in of secure materials has been completed. For each administration, check-in and verification of secure materials must be completed prior to the first shipment of results to counties. The vendor will deliver a final summary report of missing materials to the Agency.
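
For illustration, a missing materials list of this kind could be derived from the secure booklet numbers assigned to each school and the numbers actually scanned at check-in. The sketch below is an assumption about the mechanics, not the vendor’s required system.

```python
# Hypothetical sketch: derive a per-school missing materials list from
# the assigned booklet numbers and the numbers scanned at check-in.

def missing_materials(assigned, scanned):
    """assigned maps school -> set of secure booklet numbers shipped;
    scanned maps school -> set of booklet numbers read at check-in.

    Returns school -> sorted list of booklet numbers not returned.
    """
    report = {}
    for school, booklet_ids in assigned.items():
        missing = sorted(booklet_ids - scanned.get(school, set()))
        if missing:
            report[school] = missing
    return report

# Example: school 205 is missing booklets 1002 and 1005.
assigned = {"205": {1001, 1002, 1003, 1004, 1005}}
scanned = {"205": {1001, 1003, 1004}}
assert missing_materials(assigned, scanned) == {"205": [1002, 1005]}
```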


3.2.2.2.23.6 Store/Retrieve Paper Scannable Consumable Test Booklets

The vendor must ship to the Agency such quantities of these materials as may be required. In addition to the document retrieval specified, the vendor may periodically be required to retrieve documents from storage. The vendor will be responsible for costs associated with retrieval and possible delivery of these materials to the Agency. In addition, when errors are found, the vendor may be required to re-score and re-report these documents. The vendor must retain original answer documents. Copies must be organized to facilitate searching for a specific student response.


3.2.2.2.23.7 Disposition of Paper Material

Upon verification of the individual test booklet identification numbers returned by the counties and acceptance by the Agency of accurate results files, the vendor will inventory and store unused paper documents for a period of twelve months. After acceptance by the Agency of accurate computer files, used test booklets must be stored for the life of the program.

Unused answer documents may be destroyed after twelve months with written approval from the Agency. However, the vendor will store 10 copies of each grade test booklet for each administration throughout the life of the project. Any materials that may be used in subsequent assessments will be stored by the vendor. Test security requirements will be maintained throughout the destruction process.
At the end of the program, the vendor will ship or destroy the test booklets according to instructions from the Agency. This destruction will be initiated by a letter from the vendor to the Agency requesting permission to destroy specific materials. Destruction of secure documents must be requested in writing and authorized by the Agency. Further, the vendor must submit a certification of destruction that describes in writing the specific prompt/passages destroyed. If it is necessary to retain test booklets for a longer time period, the Agency will use additional funds to pay for the storage or request the documents be transferred to Agency for storage.
3.2.2.2.24 Processing and Scanning Booklets Verification Introduction

The vendor will design and implement the systems required to process and scan the results of student responses from each administration. The vendor will also develop procedures to verify the accuracy of data produced at each processing step. The vendor will work jointly with Agency on finalizing the processing rules and data analysis shortly after award. The vendor must deliver proposed requirements based on discussions and past performance history to Agency for approval.


Test processing will include receipt of the test booklets and ensuring the accuracy of data at each processing step. Additional data processing activities may include working with the Agency staff to edit accountability information and making corrections and additions to the data files as necessary.
3.2.2.2.24.1 Processing Specifications

The vendor must complete the development of all data processing, scanning, scoring, and reporting procedures prior to each test administration to ensure all procedures have been checked before the processing of student answer sheets begins. The vendor must monitor all aspects of the scanning and scoring procedures throughout the entire time actual answer documents are being scanned and scored.


The vendor is responsible for developing processing and scanning verification specifications for each administration that describe in detail all the steps to be implemented to demonstrate to the Agency that the final reports of results are accurate. The vendor is responsible for drafting and revising the processing and scanning verification specifications and receiving Agency approval at least four months prior to each test administration. The components of the processing and scanning verification plan are as follows:

  • verifying total quantities returned by schools and counties

  • ensuring all pages are correctly ordered in scannable consumable test booklets

  • monitoring intensity levels read by each scanner

  • monitoring reading of answer documents, student bar codes, and other codes identifying the answer document

  • developing guidelines for hand edits

  • training staff to perform hand edits

  • monitoring hand edits


3.2.2.2.24.2 Verify Document Receipt

The data verification plan will begin with inventorying the used scannable consumable test documents received. The vendor must compare the number of used answer documents returned to the number on the header (test booklet count form) and compare the number returned to the number ordered by the school or county.


To assist in the process, county coordinators will document in a formal letter, to the vendor and to the Agency, the destruction of any booklet. The information will include the booklet number, any other important identifying information, and the reason for destruction. Secure test materials may not be destroyed without written permission of the Agency.
The Agency would prefer an electronic method for accomplishing this task that would generate a report showing the differences between the pre-identification file n-counts and the inventory of scanned returned answer documents by school within county. When a discrepancy is identified, the vendor will follow up and work with the county test coordinator and the Agency to resolve the discrepancy.
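
A minimal sketch of such an electronic reconciliation is shown below, assuming hypothetical input structures keyed by county and school; the actual file formats would be agreed between the vendor and the Agency.

```python
# Hypothetical sketch: compare pre-identification n-counts with the
# inventory of scanned returned answer documents, by school within county.

def receipt_discrepancies(preid_counts, scanned_counts):
    """Both arguments map (county, school) -> document count.

    Returns a row for every school whose returned count differs from
    the pre-identification count.
    """
    rows = []
    for key in sorted(set(preid_counts) | set(scanned_counts)):
        expected = preid_counts.get(key, 0)
        returned = scanned_counts.get(key, 0)
        if expected != returned:
            county, school = key
            rows.append((county, school, expected, returned,
                         returned - expected))
    return rows

# Example: school 205 in county 041 returned one document fewer than
# the pre-identification file predicted, so it appears on the report.
preid = {("041", "205"): 30}
scanned = {("041", "205"): 29}
assert receipt_discrepancies(preid, scanned) == [("041", "205", 30, 29, -1)]
```
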
3.2.2.2.24.3 Scan Materials

Accurate scanning must be verified on each scanner used through the use of periodic recalibration procedures. Scanning must be monitored by the vendor between each scan run and each time a scanner is recalibrated. The vendor will provide a plan to identify what types of monitoring the vendor will be performing and what types of data will be presented to the Agency to verify the scanners are working properly through each scan run of actual scoring.



3.2.2.2.24.4 Materials Edited

Vendors shall provide a description of necessary editing of answer documents and headers which:



  • contain double grids or inaccurate gridding of printed information

  • are coded incorrectly with respect to student, school, or county identification

  • are deemed partially or wholly unscorable for some reason

  • require other necessary materials edits



3.2.2.2.24.5 Disaster Recovery Plan


The vendor shall provide a description of the plan to backup all systems, applications, and databases routinely to an onsite and offsite location. Additionally, the vendor shall detail the plan for data recovery in the event a disaster is declared where the data is maintained and stored. Database transaction logs should be archived and maintained online for 48 hours.

3.2.2.2.25 Scoring and Technology Introduction


The proposal should set forth and document the capabilities of the company to score APTA scannable consumable test booklets. The vendor will prepare score reports for West Virginia within the prescribed time limit of six weeks. The references provided by the vendor in the proposal should be able to substantiate these capabilities. The Agency desires to implement scoring processes that are reliable and valid, as well as efficient in terms of time and expenditures.
Vendors must design and implement systems that can process, score and report the results of student responses from each administration. Vendors must provide evidence of their ability to assign reliable and valid scores for the methods proposed. All student responses will be scanned and scored and there will be a 100% human read behind.
The Agency uses a combination scoring model that awards points for 1) the correctness of the response to the item and 2) the prompting level. The vendor will provide scores using the information that follows:

  • a prompting approach that defines the degree of assistance a student receives (Independent, Partial, Full) for both Multiple Choice (MC) and Constructed Response (CR) items. For example, a student could independently access the question, partially access the question, or require full prompting to answer the question. Independent receives 3 points for correct MC items and 5 to 6 points for mostly correct and fully correct CR items; Partial receives 2 points for correct MC items and 3 to 4 points for mostly correct and fully correct CR items; Full receives 1 point for correct MC items and 1 to 2 points for mostly correct and fully correct CR items.
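
The point values above lend themselves to a simple table-driven lookup. The sketch below is a minimal illustration of the combination model, assuming hypothetical function names and a three-way classification of constructed responses (fully correct, mostly correct, incorrect); it is not the Agency’s required implementation.

```python
# Hypothetical sketch of the combination scoring model described above.
# The point values follow the rubric in this section; the function names
# and the mostly/fully correct distinction are illustrative assumptions.

# Prompting level -> points for a correct MC item, per the rubric above.
MC_POINTS = {"Independent": 3, "Partial": 2, "Full": 1}
# Prompting level -> (mostly correct, fully correct) CR point pair.
CR_POINTS = {
    "Independent": (5, 6),
    "Partial": (3, 4),
    "Full": (1, 2),
}

def score_mc_item(prompt_level: str, correct: bool) -> int:
    """Score a multiple choice item under the combination model."""
    return MC_POINTS[prompt_level] if correct else 0

def score_cr_item(prompt_level: str, response_quality: str) -> int:
    """Score a constructed response item.

    response_quality is assumed to be one of 'fully_correct',
    'mostly_correct', or 'incorrect'.
    """
    mostly, fully = CR_POINTS[prompt_level]
    if response_quality == "fully_correct":
        return fully
    if response_quality == "mostly_correct":
        return mostly
    return 0

# Example: an MC item answered correctly with partial prompting earns 2
# points; a mostly correct CR response answered independently earns 5.
assert score_mc_item("Partial", correct=True) == 2
assert score_cr_item("Independent", "mostly_correct") == 5
```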

The vendor must provide a detailed plan that identifies key production, distribution, and scoring staff. Scoring accuracy is a key component in maintaining the quality and integrity of the APTA while meeting challenging scoring deadlines.


The specifics for scoring the APTA in grades 3-8 and 11 Reading/Language Arts, Mathematics, Science, and Social Studies include:

  • technical services related to the production and interpretation of results

  • detailed plan to integrate WVEIS student information with the scoring system and exporting student scores and item response files back into the WVEIS files

  • use of WVEIS unique student identifier

  • detailed plan to generate total score for Reading/LA, Mathematics, Science, and Social Studies from items

  • detailed plan to create output files including assessment data items as required by NCLB and Individuals with Disabilities Education Improvement Act of 2004 (IDEA)

  • log-in process for assessment materials

  • conversions and statistical measure calculations process

  • schedule to customize (if needed) scoring software

Scoring requires trained and qualified human readers. All responses will be scanned and there will be a 100% human read behind. The vendor should describe the process for dealing with discrepant scores. The vendor is responsible for producing the following scoring materials for each operational and field test:



  • scoring guides

  • training sets

  • qualifying sets

  • validity sets

  • group discussion sets

  • recalibration set


3.2.2.2.26 Develop Specifications

For each administration, the vendor shall provide Agency with a detailed scoring and reporting plan that documents the accuracy of all scoring and report production programs. This plan should detail the process of quality reviews of data and the printed report output during report processing and live report production. The vendor will work jointly with Agency on finalizing the scoring rules and data analysis shortly after award.


Scoring verification steps should include the following:

  • developing procedures for the vendor and the Agency to independently verify the calibrating, scaling, and equating

  • verifying that all items are equal in difficulty

  • verifying all items, which may include the level of prompting required for a student to respond, are scored correctly

  • verifying all reader scores are correctly transferred to the student’s record

  • verifying the final scores on hand-scored tasks are correctly calculated

  • verifying all aggregated scores are correctly rounded and reported

  • arranging for consumable booklets to be scanned, scored, and all student, school, county, and state reports generated and proofed by the vendor and Agency

  • developing procedures and reports to identify duplicate student records within and across counties (see the sketch following this list)

  • detailed plan to create output files

  • required customization of scoring software to the WV designed point rubric for the prompt levels of the MC and CR items

  • a 100% read-behind is required by the Agency for this assessment
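
As a sketch of the duplicate-record check named in the list above, records could be grouped on the WVEIS unique student identifier; the record layout here is assumed for illustration only.

```python
# Hypothetical sketch: flag duplicate student records within and across
# counties by grouping on the WVEIS unique student identifier.
from collections import defaultdict

def find_duplicates(records):
    """records is an iterable of dicts with 'wveis_id' and 'county' keys.

    Returns wveis_id -> list of records, for every id seen more than once.
    """
    by_id = defaultdict(list)
    for rec in records:
        by_id[rec["wveis_id"]].append(rec)
    return {sid: recs for sid, recs in by_id.items() if len(recs) > 1}

# Example: the same student appears in two counties and is flagged.
records = [
    {"wveis_id": "123456789", "county": "041"},
    {"wveis_id": "123456789", "county": "077"},  # duplicate across counties
    {"wveis_id": "987654321", "county": "041"},
]
dups = find_duplicates(records)
assert list(dups) == ["123456789"] and len(dups["123456789"]) == 2
```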


3.2.2.2.26.1 Verify Scoring

Necessary software will be created for the Agency and the vendor to independently verify the calibrating, scoring and equating. The vendor will provide all of the resources, including software, staff support, and data files to permit parallel calibration, scoring, and equating of data during the same time period they are producing operational scoring tables. Psychometric staff will be available for daily discussions and consultation throughout the parallel calibration periods.


The vendor must check the accuracy and consistency of all student level data on files before submitting the file to the Agency. This includes details such as ensuring all codes used on the final file are valid, all scores are accurately based on the students’ answers, all raw scores are aggregated correctly, all student demographic information is coded correctly, etc.
3.2.2.2.26.2 Report Verification

An independent verification of all files and each report generated by the vendor will be conducted by both the vendor and the Agency for a maximum of three counties. All of the files for each report are to be generated by the vendor and will be verified independently by the vendor and the Agency. The vendor should allow for up to eight staff members from the Agency to check these at either the vendor’s office or at a mutually agreeable site.


The final files and a copy of each report must be delivered to the Agency for data checking at least seven days prior to the mailing of the first shipment of reports to counties. The Agency will have no fewer than five working days to approve any individual file. If errors are identified on the files, additional time may be required for Agency review. Proposals may include any additional strategies that the vendor would recommend for consideration. The vendor must be prepared to regenerate files when errors are identified. The Agency will provide approval of all files before reports are printed and shipped.
3.2.2.2.26.3 Handscoring for APTA

Handscoring as used in this RFP refers to the scoring of the field test responses and the scoring processes necessary for this exam. The Agency desires to implement reliable and valid handscoring processes that are efficient in terms of time and expenditures. Therefore, vendors must provide evidence of their ability to assign reliable and valid scores for the methods proposed and also provide a detailed description of how the security of the student responses will be maintained throughout scoring.


A high level of hand scoring accuracy must be maintained while meeting challenging scoring deadlines. The vendor must utilize the resources and procedures needed to meet this requirement. Vendors will explain in detail in their proposals how the requirements of this section will be met. Student responses to field tests and operational tests will be scored by trained readers using online imaging technology. All student responses will be scanned and there will be a 100% human read behind.
The Agency will play an integral role in guiding and monitoring all aspects of training readers and scoring essay responses. The vendor will chair rangefinder review and selection meetings with input from Agency staff. Agency content staff will review and approve all final scoring materials, monitor the training of readers and the scoring sessions. Agency staff should be expected to be on-site throughout training of readers and during most of the handscoring. When not on site, Agency staff will need to have online access to all handscoring systems and reports and will communicate frequently with the vendor throughout the scoring process.
3.2.2.2.26.4 Produce Handscoring Specifications

The vendor is expected to incorporate the procedural, design, and implementation requirements for scoring scannable consumable test booklets into written specifications developed for field test administrations and operational administration.


The vendor will provide the option for the Agency to update handscoring specifications seven months prior to each spring administration. The handscoring process and procedures from the previous administration will be reviewed and updated as needed after each administration in order to improve the processes for the next administration. The handscoring specifications will be a detailed guide to conducting handscoring and be used by the vendor’s handscoring managers and the Agency. The specifications will, at a minimum, include the topics listed in the next sections.
3.2.2.2.26.5 Conduct Performance Scoring Operations

Scoring performance items requires a series of procedures designed to maintain the reliability and validity of examinee test scores, provide scoring reliability, quality control, and adequate management control. This RFP calls for the vendor to implement handscoring processes with a 100% read behind and procedures according to the Agency’s requirements. Enhancements to these processes are acceptable when approved by the Agency.



3.2.2.2.26.6 Produce Scoring Materials

The vendor will develop an electronic system for cataloging and storing all scoring materials developed during the course of the project. The vendor is responsible for producing the following scoring materials for each field test and operational test:



  • scoring guides

  • training sets

  • qualifying sets

  • validity sets

  • group discussion sets

  • recalibration sets

Agency staff will work closely with the vendor’s staff to prepare scoring materials. Meetings between Agency and vendor staff will be held following the rangefinder review meeting to initiate the development of scoring materials. All scoring procedures will be submitted to the Agency for review and approval.


Scoring materials must be approved at least three weeks prior to the beginning of training and scoring. The vendor will be responsible for developing a detailed schedule to be included in the handscoring specifications identifying steps in the development of scoring materials. At the completion of scoring, the vendor will provide the Agency with organized electronic copies of all scoring materials prepared for and utilized during scoring.
3.2.2.2.26.7 Handscoring Reports

The vendor will scan the scannable consumable booklets to specifications so that information is captured in a manner that allows the vendor to generate the technical data needed to produce high-quality reports. A subset of these reports, including the primary inter-rater reliability and validity reports, will be available to the scoring director. The vendor should identify and describe the proposed reports used both externally and internally to monitor the quality and pace of the scoring session.


At the completion of field-test scoring and operational scoring, the vendor will provide the Agency with final copies of all cumulative handscoring score reports. The summary handscoring reports are to be made available as electronic files on CD-ROM. The vendor will produce a technical report that summarizes the score reports and provides details related to the reliability and validity of the field test and operational handscoring procedures.
3.2.2.2.26.8 Scoring Student Responses

Program rubrics and scoring criteria are holistic in nature. The scoring rubrics for APTA will be the West Virginia three-point multiple choice rubric and the six-point constructed response rubric, which represent focused holistic scoring. All field-test and operational constructed responses will be scanned and scored, with a 100% independent human read behind.


3.2.2.2.26.9 Monitor and Maintain Hand Scoring Quality

Monitoring and maintenance procedures are intended to establish and maintain high levels of scoring accuracy. An important element of these procedures is that they must quickly identify individual readers who fail to maintain acceptable scoring standards and apply corrective strategies. The vendor must be prepared to utilize all procedures identified in this section. The vendor will also be expected to contribute additional ideas and procedures to monitor and maintain handscoring quality.


As part of the imaging and handscoring specifications for each administration, the vendor, in consultation with the Agency, will plan the combination of monitoring and maintenance procedures to most efficiently maintain the required high levels of scoring accuracy. The Agency will give final approval to these procedures:

  • Daily Systematic Review of Handscoring Reports

  • 100% Read Behinds

  • Scoring Validity Sets

  • Automatic Targeting

  • Targeted Validity Set Administration

  • Pseudo-Scoring

  • Group Retraining

  • Individual Conferencing

  • Dismissal (The vendor will dismiss readers who fail to perform satisfactorily following retraining.)


3.2.2.2.26.10 Handscoring Personnel

All project directors, scoring directors, team leaders, and readers must have earned a bachelor's degree. All personnel must sign an agreement with the Agency that they will maintain the security of materials, in addition to any security agreements required by the vendor. The vendor must describe its screening process for hiring personnel associated with scoring. To be hired as a reader, trainees are required to meet established standards. A reader must maintain a minimum of 70 percent perfect agreement.
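
For illustration, perfect agreement can be read as the share of a reader's scores that exactly match the corresponding second (read-behind) scores. The sketch below shows that calculation under that assumption; it is not a formula prescribed by this RFP.

```python
# Hypothetical sketch: percent perfect (exact) agreement between a
# reader's scores and the corresponding read-behind scores.

def perfect_agreement(reader_scores, readbehind_scores):
    """Return the exact-agreement rate as a percentage (0-100)."""
    if len(reader_scores) != len(readbehind_scores) or not reader_scores:
        raise ValueError("score lists must be non-empty and equal length")
    matches = sum(a == b for a, b in zip(reader_scores, readbehind_scores))
    return 100.0 * matches / len(reader_scores)

# A reader matching the read-behind on 7 of 10 responses sits exactly
# at the 70 percent minimum named in this section.
assert perfect_agreement([3, 2, 5, 4, 1, 6, 3, 2, 4, 5],
                         [3, 2, 5, 4, 1, 6, 3, 1, 3, 6]) == 70.0
```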


3.2.2.2.26.11 Scoring/Reporting Project Director for Handscoring

The vendor will assign its most qualified scoring staff to direct scoring for West Virginia’s responses. These staff must have an appropriate educational background and extensive experience in directing state-level performance projects as members of the vendor’s regular scoring staff. All scoring directors must have worked in scoring director roles for the vendor on a regular, continuing basis. The vendor will appoint a project director to serve as the vendor’s overall director for the project.


The project director must be available on a daily basis to discuss issues with site scoring directors and the Agency either in person or via phone, email, or fax throughout the training and scoring sessions. The site scoring directors will be on site throughout the training and scoring sessions and will personally assist scoring directors during the training of team leaders and readers and throughout the scoring sessions.
The scoring directors must participate in the rangefinder review and selection meetings. The scoring director for each grade will work to ensure the rangefinder selection and the ongoing direction of operational scoring are both conducted at the highest levels of quality.

For each assessment, the project and site scoring directors will conduct training for scoring directors with the assistance of Agency staff after the completion of the rangefinder review meeting.


3.2.2.2.26.12 Scoring Leaders

Team leaders must go through the same screening process as readers. The Agency requires that team leaders have previous experience as readers and as team leaders if at all possible. At a minimum, team leaders must be experienced readers and be degreed in the assigned content area.


3.2.2.2.26.13 Recruit and Hire Readers

Vendors will include an analysis of the number of people that must be recruited, hired and subsequently qualified as readers to complete the scoring within the time required to return reports to counties by the dates designated in the project schedule. The vendor will describe the number of team leaders needed for each grade. This detailed analysis must be completed in the proposal for the 2008 field test and 2009 operational administrations.


3.2.2.2.26.14 Training and Qualifying of Readers

The vendor should conduct a separate training session for each prompt at each grade level of APTA essay responses. The vendor will determine the training and qualifying of readers. The scoring director will conduct training with the assistance of team leaders, under the direction of site scoring directors.


The purpose of the training is to ensure each person who scores responses has met the Agency's standards for scoring. The training process is essential for ensuring scores assigned to student responses provide valid and reliable information. The vendor is responsible for developing training procedures in consultation with the Agency, and the Agency will have final approval on all training techniques.
At the conclusion of training, qualified readers will be taught how to use the Agency’s alert system to identify students whose responses indicate the need for an outside agency’s intervention.
3.2.2.2.26.15 Scoring Site

APTA scoring must be conducted at one of the vendor’s established scoring sites that draws on the vendor’s most experienced pool of readers, who participate in scoring activities on a regular basis throughout the calendar year.


The vendor’s proposal will identify the location of the proposed scoring site. The number of sites may not exceed one for handscoring APTA without Agency approval. The Agency reserves the right to approve scoring sites and the distribution of grade scoring across sites.
At the Agency’s option, observers may be allowed access to the scoring centers for brief periods of time for the purpose of generally understanding the process. Agency officials or vendor staff designated by the Agency will accompany such visitors.
3.2.2.2.26.16 Expedite Performance Scoring

As stated, the Agency desires that the tests be scored and reported in the most expeditious manner possible (six weeks scoring requirement with results to be posted on FTP site). Vendors are expected to consider alternatives to make it possible for the statewide assessments to be processed according to a timeline shorter than the one specified. The Agency must approve any alternatives the vendor proposes. Security is of the utmost importance and any proposed scoring solution must provide security guarantees. For purposes of this proposal, all vendors must submit proposals that meet the common minimum approach set forth in the requirements of this RFP.


3.2.2.2.26.17 Overall Scoring Quality Control

The vendor shall provide quality control systems to verify the accuracy of the scoring, processing, and reporting. In addition, the vendor will provide the results of these quality control reviews to the Agency so that the Agency can ensure any identified problems have been rectified. In addition, the Agency may operate separate quality control operations. In so doing, the Agency may utilize the services of one or more vendors to assist in verification of the quality and accuracy of the statewide assessment results. These vendors will work under the direction of the Agency and will perform data verification checks at times and places so designated. The vendor will be obligated to provide data, information, explanations, and work space, if necessary, for data verification vendors working with the Agency.


The objective of the quality control processes is to triangulate analyses and verify the data being reported are correct. The Agency will review all quality control findings and will provide permission for the vendor to prepare and distribute test results.
3.2.2.2.27 Reports Descriptions

Vendor's proposed timeline for completing proposed services must respect the Spring 2008 field test and 2009 operational test administration dates. All reports are to be separated before being shipped to counties or to the Agency.


In addition, the vendor will be prepared to process missing or erroneous reports throughout the duration of the contract. Copies of data files for each test administration shall be maintained throughout the duration of the contract. Three distribution levels will be specified - school, county, and state. School reports will be shipped to the County Test Coordinator at each county office.
The following is a description of the reports desired by Agency. The APTA reports for the program must include:
1. STUDENT REPORT:

The individual Student Report will at a minimum include the following:



    • Student name, grade, date of birth, WVEIS #, class, school, county, state, and explanatory information about the scores

    • Total summative score and performance level

      • Writing performance, based on the constructed response prompt, for all students of the same grade level must be reliably combined into each student’s Reading/Language Arts score

    • Definition of terms

    • Performance level descriptors

      • One side of student report will capture the performance level descriptors


2. INDIVIDUAL ITEM ANALYSIS (Per Student):

The Individual Item Analysis shall, at a minimum, provide for each student:



  • Listing of all content standards and objectives for all items on the test

    • All objectives must be clustered under appropriate standards

  • Indication of whether the response to each multiple choice item is correct or incorrect

  • Indication of the number of points possible and the points received by the student for constructed response items

  • Thinking skills levels by item/definition of thinking skills

  • Definitions for all unfamiliar terms on the report


3. ITEM ANALYSIS SUMMARY BY SUBGROUP REPORT (School, County and State)

The Item Analysis Summary by Subgroup Report shall be organized by grade by school, by grade by county, and by grade by state and shall include at a minimum the following:



  • Results disaggregated by All, low Socio-economic Status, Special Education, Black, White, Hispanic, Asian/Pacific, Native American/Alaskan, Limited English Proficient, for grades K – 11

  • Content area, item number, CSO number and description

    • All objectives must be clustered by standards for each content area

  • Percent of students with item correct, percent of students with each score point, number of points possible, thinking skills levels by item

  • Definition of thinking skills level

  • Definition of all unfamiliar terms on the report


4. CONFIDENTIAL ROSTER REPORT (ALPHA Level by Grade for School Level Only):

The Confidential Roster Report shall be organized by grade by school and by grade by county and shall include the following:



  • Student names in alphabetical order by grade (last name, first name, middle initial), grade, date of birth, WVEIS number, school, county, test date, prompt type

  • Student scale scores for all content areas and performance levels

  • All content areas should have assessment scores and performance levels

    • Show combined Reading/Language Arts/Writing Assessment scores and separate writing score and performance level


5. CONFIDENTIAL SUMMARY REPORTS (School, County and State):

The West Virginia Confidential Summary Report shall be prepared for all schools, counties, and state. They shall contain a graph of the percent of students who attained each performance level category. The report shall also show the distribution of analytic trait scores for the group. The report shall be organized by grade by school, by grade by county, and by grade by state and shall at a minimum include the following:




  • Grade and test date

  • Number of students tested by content areas and subgroups (all, gender, race/ethnicity, students with disabilities, Limited English Proficient students, migrant, economically disadvantaged)

  • Performance levels by aggregate number and percent of students at each performance level

  • Mean scale scores and grade level mastery of the content area

  • Number of students tested by content standard, grade level mastery of content standards, and mean percent correct by content standard

  • Definition of all unfamiliar terms on the report

  • Writing performance levels for all students of the same grade level must be reliably combined into the Reading/Language Arts score

  • Number of students tested by content standard, grade level mastery of content standards, and mean percent correct

  • Definition of mean percent correct

Upon all corrections being made by the vendor and the Agency, the vendor will supply:



        • all corrected reports to Agency via an FTP site within 8 weeks of receipt and scoring of assessments and

        • the aggregate performance, as per the guidelines of this Confidential Summary Report, of all grade levels within each school, each county, and the state.


6. GENERAL RESEARCH FILE:

A report shall be programmed and made available to provide electronic data for preparing accountability reports. This file shall be organized by school, grade, county, state, and subgroups, and shall agree with the data reported on summary lines in the school, county, and state level reports.


7. STUDENT LABEL

    • The Student Label will at a minimum include the following:

      • Student name, grade, school, test date, gender, date of birth, WVEIS number, content area, scale score, performance level by grade by school

      • Self-adhesive to allow attachment to the student record


8. ELECTRONIC DATABASE PROGRAM OF STUDENT PERFORMANCE

    • Individual Student Performance with a re-rostering option for teachers

      • Reports by subgroup to include Title I and Special Education

and all other federally required subgroups

      • Capability of producing longitudinal data reports

      • Program should be customized to address the reports required in this proposal

      • Product may be software or web-based

      • Product should be compatible with software programs

This section requires the vendor to provide costs for placing all reports except the student report on a secure FTP site for local school districts for electronic retrieval and printing. All individual student reports are to be printed by the vendor and distributed to county test coordinators for dissemination to local schools. The vendor must supply two copies of each individual student report to the local school district.
Note: Any proprietary software required (along with all software support) to read the data must be included for the Agency and updated throughout the contract. Vendors are to fully describe this software. If additional copies will be required at the county level, pricing for this must be included for the life of the contract.
3.2.2.2.28 Sample Reports

The vendor is responsible for providing examples of each report. The vendor must provide its procedures for quality reviews of data generated onto the reports. The vendor will work jointly with the Agency on the final reporting considerations shortly after award.


3.2.2.2.29 Report Development

The vendor and Agency will extensively review all data files before they are used to produce live reports. The vendor must produce a live data file with a sample population composed of at least three counties selected by the Agency. This file will be used to check student-level and aggregated data for each grade. Each phase of reports will be created from this live data check file; both the file and reports shall be sent to Agency for verification and approval.


Agency will review the data and draft reports and will work with the vendor to resolve any questions. Agency expects the vendor to conduct an extensive quality check before the file and final reports are sent to Agency. The vendor will not provide individual student data or reports for the field test administration.
3.2.2.2.29.1 Update Report Designs

The vendor is responsible for annually reviewing and updating the design of the individual student, school, county, and state reports of test results in consultation with the Agency. Though it is expected report formats will not change extensively from year to year, the vendor should, after each administration, pursue reporting requirements from Agency and make any changes required by Agency until final approval is given. No extra cost may be charged to Agency.


3.2.2.2.29.2 Report Delivery

During each administration, numerous reports and data files are provided for students, parents, schools, counties, the state, and the general public, with data aggregated in various ways. The actual reports and data files to be generated are described in Section 3.2.2.2.27. The vendor must prepare the data files using formats approved by the Agency. In addition to reports of results, there are also additional reports, including missing secure materials reports. Requirements are established for many reports to be available as electronic files in formats compliant with Section 508 of the Rehabilitation Act (refer to: http://www.section508.gov/) and to allow the files to be both viewed on a website and downloaded.


3.2.2.2.29.3 Report Phases/Timelines

The vendor will not provide individual student data or reports for the field test administration.


Following vendor quality checks, reports for all spring test administrations shall be delivered to the counties on a secure FTP site, with a county folder and local school folders, within five weeks after the vendor receives the tests if a multiple choice only option is selected and within eight weeks if constructed response items are selected. All student reports will be shipped to local counties in hardcopy after the vendor receives and scores the tests. The Agency expects the vendor to distribute reports via the secure FTP site. The vendor and Agency will develop a process by which school files can be loaded onto the FTP site in PDF format.
All student reports for schools are to be packaged by schools, but sent to the County Test Coordinators. All printed products will be proofed by the vendor and copies will be sent to the Agency for proofing and approval prior to mailing any product to the counties.
The Agency must review at least three county reports before shipment of any reports. All student reports should be original laser copies. The vendor shall be responsible for maintaining copies of electronic data files for each test administration.
The Agency reserves the right to request some records be removed from processing until specific issues are resolved. These issues include duplicate records, records with blank WVEIS Student Identification Numbers and/or blank names, schools or students whose test records are under investigation for possible cheating, or other issues that might affect school totals. The issues regarding the suppressed records will be dealt with as soon as possible after reporting is completed.
The vendor may be requested to change the score reported flag on the file to one that would not report the student’s score, pull test documents to resolve duplicate tester issues, add a corrected Student Identification Number or corrected name to a record, produce Individual Student Reports (as directed), and/or School Lists of Students. The vendor will work with the Agency to establish a timeline for the processing and reporting of these records.
3.2.2.2.30 Electronic Records

For each administration, the vendor will supply the Agency with an electronic file, in a format approved by the Agency, containing data aggregated by grade and subject for each school, county, and the state, as per Section 3.2.2.2.27 Reports Descriptions. These electronic records will agree with the data reported on summary lines in the county, state, and school level reports. Additional summary statistics for each school, county, and the state will be reported by disaggregated characteristics such as racial/ethnic group, gender, and other demographic information. Every summary statistic printed in the paper reports should be represented in this file.


The vendor will be responsible for checking to ensure all files are consistent and accurately reflect the data provided on the reports. The Agency will independently verify the consistency and accuracy of the data files. The Agency retains the right to observe report generation at the vendor’s site or at a mutually agreeable site.
3.2.2.2.31 Reports Descriptions/Timelines

Vendor's proposed timeline for completing proposed services must respect the Spring 2008 Field Test and 2009 Operational Test administration dates. All student reports are to be separated by school before being shipped to counties. In addition, the vendor will be prepared to process missing or erroneous reports throughout the duration of the contract. Copies of data files for each test administration shall be maintained throughout the duration of the contract. Three distribution levels will be specified - school, county, and state. School reports will be shipped to the county test coordinator at each central office.


3.2.2.2.32 Optional Reporting Services

The Agency requests that each vendor provide costs/quotes for any optional services, enhancements or projected updates in the proposal. Please provide information about product efficiency, usability, expanded reporting capabilities and costing. These optional services must be available upon request by the Agency.


3.2.2.2.33 Disaster Recovery Plan

The vendor shall provide a description of the plan to backup all systems, applications, and databases routinely to an onsite and offsite location. Additionally, the vendor shall detail the plan for data recovery in the event a disaster is declared where the data is maintained and stored. Database transaction logs should be archived and maintained online for 48 hours.




