Request for Proposal (RFP)




3.2.2.4.16.2 Test Examiner Manual


The vendor shall develop, print, and distribute a Test Examiner Manual containing instructions for administering the assessment at the appropriate grade levels, general information about how to conduct the assessment, and specific test instructions. The manual will also include information about the security of materials. The manual will be available in an electronic format accessible via the Agency’s website prior to the hardcopy distribution.
The vendor shall submit to the Agency in a timely manner the final proofs of the answer documents along with the proofs of the Examiner Manual to ensure the testing instructions are consistent with the testing instrument.

3.2.2.4.16.3 County Test Coordinator Manual


A County Test Coordinator Manual will be developed, printed, and distributed each year. Manuals will contain coordinator-level instructions for handling and administering the appropriate grade-level assessment. They also will include information about accommodations and security of materials. The manual will also be available in an electronic format accessible via the Agency’s website.
The vendor shall submit to the Agency in a timely manner the final proofs of the Examiner Manuals and the County Test Coordinator Manual to ensure the testing instructions are consistent with the testing instruments. All manuals will be subjected to the same proofing and printing stages as the answer documents. The vendor shall ensure manual proofs are free of typographical and format errors before they are submitted to the Agency.
3.2.2.4.16.4 Other Ancillary Documents

The vendor will also print any additional materials needed to implement the project, such as transmittal memoranda, packing labels, and packing lists. These documents are to include the following:



  • County

  • School Header

  • Answer Book Envelopes

  • Labels

  • Materials Checklist

  • Memos

All ancillary documents will be subjected to the same proofing and printing stages as the test booklets and answer documents. The vendor shall ensure proofs are free of typographical and format errors before they are submitted to the Agency. The vendor shall submit a schedule for reviews of the manuals and other ancillary materials.


3.2.2.4.16.5 Braille and Large Print Documents

For student-level answer documents, the vendor will provide sufficient quantities of Braille and large-print versions at each grade level for visually impaired students. A publisher of Braille and large-print materials approved by the Agency will produce the large-print and Braille versions of the answer documents at the vendor’s expense. The vendor will provide the electronic files in the correct format to the Braille and large-print subcontractor(s). The vendor will also develop the test administrator notes and scripts accompanying the Braille test versions.


For Braille printed products, the vendor is responsible for having the materials proofed by an independent party. The Agency may also employ the services of a Braille proofreader. Large-print documents will be printed in a minimum of 18-point type on approved paper no larger than 14” x 17” and bound using agreed-upon methods. All Braille materials must meet specifications from the American Printing House for the Blind. The Agency retains the right to require reformatting of documents to meet revised specifications.
3.2.2.4.16.6 Breach Forms

If a breach occurs, a new prompt/passage of a different genre will be assigned to the affected student(s) for the current year. An appropriate quantity of breach forms must be available on an annual basis. For 2008-2009, 100 forms per grade level of each genre must be provided to the Agency for dissemination; the same applies for 2009-2010, and after 2010 the prompts/passages will rotate. If the online option is selected, the vendor must have the capability to invalidate a student’s response and score and assign a new prompt/passage of a different genre, as sketched below. The reassignment must be made within 24 hours of notification by the Agency.
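The reassignment step itself is mechanical; the following minimal Python sketch (all names and the prompt pool are hypothetical, since the RFP does not prescribe an implementation) illustrates invalidating a response and assigning a replacement prompt of a different genre.

    import random

    # Hypothetical prompt pool: each prompt has an identifier and a genre.
    PROMPT_POOL = [
        {"id": "P-301", "genre": "narrative"},
        {"id": "P-302", "genre": "descriptive"},
        {"id": "P-303", "genre": "persuasive"},
    ]

    def reassign_prompt(student_record, breached_genre):
        """Invalidate the student's response/score and assign a different-genre prompt."""
        student_record["response"] = None   # invalidate the breached response
        student_record["score"] = None      # and its score
        candidates = [p for p in PROMPT_POOL if p["genre"] != breached_genre]
        if not candidates:
            raise ValueError("no replacement prompt of a different genre is available")
        student_record["prompt_id"] = random.choice(candidates)["id"]
        return student_record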


3.2.2.4.16.7 Responses to Writing Prompts

The vendor must retain original answer documents for both paper-based and online solutions. Files must be organized to facilitate searching for a specific student response. Files must be accessible such that an individual essay can be printed when needed.


3.2.2.4.17 Materials Distribution/Retrieval

Many forms and materials are needed to implement the Grades 3-11 Writing Assessment. Some of the materials listed in this section will assist schools, counties and the state in implementing quality control procedures and will ensure the integrity of the data collected by the program. The Agency also uses special forms to evaluate the quality of the assessment program and its implementation annually.


3.2.2.4.17.1 Scoring Materials

The specifics for scoring the Writing Assessment include:



  • Technical services related to the production and interpretation of results

  • Technical assistance/psychometric services to combine the reading subtest score, language arts subtest score, and writing score into a Reading/LA score for grades 3-11.

  • Plan to integrate WVEIS student information with the scoring system and to export student scores and item response files back into the WVEIS files

  • Use of WVEIS unique student identifier

  • Plan to create output files including assessment data items as required by NCLB and Individuals with Disabilities Education Improvement Act of 2004 (IDEA)

  • Log-in process for assessment materials

  • Conversions and statistical measure calculations process

  • Schedule to customize (if needed) scoring software

The vendor is responsible for all arrangements and costs associated with packing, distributing, and returning materials. Prompt and accurate delivery of materials is important to the Agency and to local county personnel who have the responsibility of managing test materials. There must be 100 percent accountability for all test booklets and answer booklets returned by the counties, using bar code labeling systems. The vendor must guarantee distribution procedures are satisfactory. Vendors’ proposals must include descriptions of the procedures they will use to complete these tasks.


3.2.2.4.17.2 Pre-identification of Answer Documents

The vendor will be responsible for pre-identifying answer documents using data transmitted via an electronic medium from the Agency for each administration. The vendor will assume pre-identification will be used for 100 percent of the school population. The vendor must establish a system to allow the Agency to transmit pre-identification data electronically over a secure data transmission network accessible only to authorized users. The vendor will bear the cost of establishing the system and providing network-specific software if needed by the Agency to access the system.


For field test, pilot, and operational administrations, the Agency will submit pre-identification files to provide estimated counts for print materials. A second file will be submitted for labels and pre-identified answer documents. The vendor will then send labels (packaged by school) to arrive in counties two weeks before the test administration. Counties will have the option of requesting pre-identified answer documents packaged alphabetically by school, class, or other groupings as defined by the order in which student records are supplied on data files.

The counties will have the option of specifying the sort order for the pre-identified answer documents/labels. The vendor will establish a system to ensure the pre-identified answer documents/labels delivered to counties contain accurate data, reflect the options selected by counties, and are printed at a level of quality that permits accurate scanning and precludes the possibility of smudging. The vendor will provide the Agency with a checking program, to be used before submitting data to the vendor, to help ensure all data fields include acceptable data.
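For illustration only, a checking program of the kind described might validate each pre-identification record before transmission; the field names, the nine-digit WVEIS number format, and the validation rules below are assumptions, not requirements of this RFP.

    import re

    WVEIS_ID = re.compile(r"^\d{9}$")    # assumed format for the WVEIS number
    VALID_GRADES = set(range(3, 12))     # grades 3-11 per the Writing Assessment

    def check_record(record):
        """Return a list of problems found in one pre-identification record."""
        problems = []
        if not record.get("last_name") or not record.get("first_name"):
            problems.append("missing student name")
        if not WVEIS_ID.match(record.get("wveis_number", "")):
            problems.append("invalid WVEIS number")
        if record.get("grade") not in VALID_GRADES:
            problems.append("grade outside 3-11")
        if not record.get("school_code") or not record.get("county_code"):
            problems.append("missing school or county code")
        return problems

    def check_file(records):
        """Map record index to its problems, for every record with any problem."""
        report = {}
        for i, record in enumerate(records):
            problems = check_record(record)
            if problems:
                report[i] = problems
        return report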


Vendors will provide quality control throughout the printing process to ensure the quality of label printing. Pre-identified labels will be packaged and labeled by school and shipped to counties as part of the shipment containing test books and answer books.
3.2.2.4.17.3 Optional Technology Systems

Vendors shall also propose solutions and other technologies to permit the pre-identified answer documents/labels to be printed on dates closer to the actual spring test administration dates. Proposals for other technologies to pre-identify answer books will be acceptable as long as the alternative provides at least the same level of timeliness, reliability, and accuracy as the labels. A pilot test of the vendor’s proposed system will be conducted for the Spring 2008 field test administration.


3.2.2.4.17.4 Pack and Distribute Materials

3.2.2.4.17.4.1 Packaging Specifications

The vendor will prepare packaging specifications to be delivered to the Agency four months before the field test and each operational administration. The packaging specifications will be updated as required to meet Agency needs and will include the vendor’s procedures for packing and distributing materials to counties and receiving the materials from the counties. The specifications will include a description of how the materials are packed; examples of packing and inventory lists for boxes to counties and schools; methods used for distributing and returning materials; and a description of the procedures used to inventory materials as they are returned.


3.2.2.4.17.4.2 Quantities

The Agency will provide the vendor with a list of the current names, addresses, email addresses, and phone and fax numbers of the county test coordinators. These persons will monitor all aspects of the assessment for their counties. Vendors must ship materials to county test coordinators at approximately 57 separate sites. The number of counties and special schools that serve as counties may change slightly during the life of the contract.


The Agency estimates that testing materials will be provided to approximately 710 schools; the number of students per grade level is listed in Table 29. A list of the primary materials to be shipped for each administration, the quantities to be packaged for schools, counties, and the Agency, and other packaging specifications are given in Table 35 and Table 36.
3.2.2.4.17.5 List of County Test Coordinators

The vendor will be responsible for maintaining the list of county test coordinators and updating it as notified by the Agency. At the beginning of the contract, the Agency will provide the vendor with a data file containing a list of the counties and schools (names and identification numbers) and the numbers of students tested by grade during the last year. The vendor will also be responsible for maintaining and updating this data file.


3.2.2.4.17.6 Packing, Distributing and Receiving Materials Provisions

The vendor’s proposal and final specifications for packing, distributing, and receiving materials will address the provisions outlined in Table 35 and Table 36.


3.2.2.4.17.7 Missing Materials Report and Inventory

The vendor will prepare a Missing Materials Report of Secure Answer Documents based on the scanning completed during materials check-in. Reports will be prepared for each school with missing materials, listing the number of missing test booklets and answer documents and the identification of each. Any missing materials returned by counties will be recorded in the missing materials inventory maintained by the vendor. The missing materials reports must be delivered to the counties and the Agency 30 days after the check-in of secure materials has been completed. For each administration, check-in and verification of secure materials must be completed prior to the first shipment of results to counties. The vendor will deliver a final summary report of missing materials to the Agency.


3.2.2.4.17.8 Store/Retrieve Paper Answer Documents

The vendor must ship stored answer documents to the Agency in such quantities as may be required. In addition to the document retrieval specified, the vendor may periodically be required to retrieve answer documents from storage. The vendor will be responsible for costs associated with retrieval and possible delivery of these materials to the Agency. In addition, when errors are found, the vendor may be required to re-score and re-report these documents. The vendor must retain original answer documents. Copies must be organized to facilitate searching for a specific student response. Files must be accessible such that an individual essay can be printed when needed.


3.2.2.4.17.9 Disposition of Paper Materials

Upon verification of the individual answer document identification numbers of all answer documents returned by the counties and acceptance by the Agency of accurate results files, the vendor will inventory and store unused paper documents for a period of twelve months. After acceptance by the Agency of accurate computer files, answer documents must be stored for the life of the program.

Unused answer documents may be destroyed after twelve months with written approval from the Agency. However, the vendor will store 100 copies of each grade’s answer document for each administration throughout the life of the project. Any materials that may be used in subsequent assessments will be stored by the vendor. Additionally, after twelve months, the vendor and print vendors should destroy electronic files and print copies according to test security requirements recommended by the vendor and approved by the Agency. Test security requirements will be maintained throughout the destruction process.
At the end of the program, the vendor will ship or destroy the answer documents according to instructions from the Agency. This destruction will be initiated by a letter from the vendor to the Agency requesting permission to destroy specific materials. Destruction of secure documents must be requested in writing and authorized by the Agency. Further, the vendor must submit a certification of destruction that describes in writing the specific prompts/passages destroyed. If it is necessary to retain answer documents for a longer time period, the Agency will use additional funds to pay for the storage or request the documents be transferred to the Agency for storage.
3.2.2.4.18 Processing and Scanning Verification

The vendor will design and implement the systems required to process and scan the results of student responses from each administration. The vendor will also develop procedures to verify the accuracy of data produced at each processing step. The vendor will work jointly with the Agency on finalizing the processing rules and data analysis shortly after award. The vendor must deliver proposed requirements based on discussions and past performance history to the Agency for approval.


Test processing will include receipt of the answer documents and ensuring the accuracy of data at each processing step. Additional data processing activities may include working with the Agency staff to edit accountability information and making corrections and additions to the data files as necessary.
3.2.2.4.18.1 Processing Specifications

The vendor must complete the development of all data processing, scanning, scoring, and reporting procedures prior to each test administration to ensure all procedures have been checked before the processing of student answer sheets begins. The vendor must monitor all aspects of the scanning and scoring procedures throughout the entire time actual answer documents are being scanned and scored.


The vendor is responsible for developing processing and scanning verification specifications for each administration that describe in detail all the steps to be implemented to demonstrate to the Agency that the final reports of results are accurate. The vendor is responsible for drafting and revising the processing and scanning verification specifications and receiving Agency approval at least four months prior to each test administration. The components of the processing and scanning verification plan are as follows:

  • verifying total quantities returned by schools and counties

  • ensuring all pages are correctly ordered in answer documents

  • monitoring intensity levels read by each scanner

  • monitoring reading of answer documents, student bar codes, and other codes identifying the answer document

  • developing guidelines for hand edits

  • training staff to perform hand edits

  • monitoring hand edits


3.2.2.4.18.2 Verify Document Receipt

The data verification plan will begin with inventorying the used answer documents received. The vendor must compare the number of used answer documents returned to the number on the header (answer document count form) and compare the number returned to the number ordered by the school or county.


To assist in the process, county coordinators will document, in a formal letter to the vendor and to the Agency, the destruction of any test booklet or answer document. The information will include the booklet number, any other important identifying information, and the reason for destruction. Secure test materials may not be destroyed without written permission of the Agency.
The Agency would prefer an electronic method for accomplishing this task that would generate a report showing the differences between the pre-identification file n-counts and the inventory of scanned returned answer documents by school within county. When a discrepancy is identified, the vendor will follow up and work with the county test coordinator and the Agency to resolve the discrepancy.
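One way to produce the preferred electronic discrepancy report is sketched below; the record shape (one dict per answer document with county and school fields) is an assumption, not a prescribed design.

    from collections import Counter

    def discrepancy_report(preid_records, scanned_records):
        """Compare pre-ID n-counts to scanned-return counts by school within county."""
        expected = Counter((r["county"], r["school"]) for r in preid_records)
        returned = Counter((r["county"], r["school"]) for r in scanned_records)
        rows = []
        for key in sorted(set(expected) | set(returned)):
            difference = returned[key] - expected[key]
            if difference != 0:
                county, school = key
                rows.append({"county": county, "school": school,
                             "expected": expected[key],
                             "returned": returned[key],
                             "difference": difference})
        return rows

Each nonzero row is a discrepancy for the vendor to follow up on with the county test coordinator and the Agency.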
3.2.2.4.18.3 Scan Materials

Accurate scanning must be verified on each scanner used through the use of periodic recalibration procedures. Scanning must be monitored by the vendor between each scan run and each time a scanner is recalibrated. The vendor will provide a plan to identify what types of monitoring the vendor will be performing and what types of data will be presented to the Agency to verify the scanners are working properly through each scan run of actual scoring.



3.2.2.4.18.4 Materials Edited

Vendors shall provide a description of the necessary editing of answer documents and headers which:



  • contain double grids or inaccurate gridding of printed information

  • are coded incorrectly with respect to student, school, or county identification

  • are deemed partially or wholly unscorable for some reason

  • require other necessary materials edits


3.2.2.4.19 Scoring and Technology

The proposal should set forth and document the capabilities of the vendor to score West Virginia materials within the prescribed time limits. The references in the proposal should be able to substantiate these capabilities. The Agency desires to implement scoring processes that are reliable and valid, as well as efficient in terms of time and expenditures.


Vendors must design and implement systems, based on image-based scoring and artificial intelligence scoring, that can process, score, and report the results of student responses from each administration. Vendors must provide evidence of their ability to assign reliable and valid scores for the methods proposed. All student responses will be scored using the West Virginia Six-Point Rubric. Each response will be assigned an analytic score in each of the criteria areas of Organization, Development, Sentence Structure, Word Choice, and Mechanics. A Summative Score and a Performance Level will be assigned based on the summary of the scores in these criteria areas. Vendors must provide a detailed description of how the security of the prompts and student responses will be maintained throughout scoring. The vendor must provide a detailed plan that identifies key production, distribution, and scoring staff. Scoring accuracy is a key component in maintaining the quality and integrity of the Writing Assessment while meeting challenging scoring deadlines.
The specifics for scoring the Writing Assessment in grades 3–11 shall include the following:

  • technical services related to the production and interpretation of results

  • technical assistance/psychometric services to combine results for reading, English, and writing into a Reading/Language Arts score

  • plans to integrate WVEIS student information with the scoring system and exporting student scores back into the WVEIS files

  • use of student WVEIS unique identifier

  • plan to create output files including assessment data items as required by NCLB and Individuals with Disabilities Education Improvement Act of 2004 (IDEA)

  • log in process for assessment materials

  • conversions and statistical measure calculations process

  • schedule to customize (if needed) scoring software

  • platform on which the programs run

  • process to administer breach forms

A number of methodologies are available for implementing image-based scoring, including scoring responses at specially equipped sites or specially equipped remote sites, or distributing responses to individual readers via an intranet. Responses can also be scored via an artificial intelligence scoring system (Options # 1 - 6 are set forth in Table 27).


Image-based scoring requires trained and qualified human readers. All student responses would be read by a minimum of two trained readers. Discrepancies greater than one holistic score point would be resolved by an independent third reading. The vendor should describe the process for dealing with discrepant scores. For image-based scoring the vendor is responsible for producing the following scoring materials for each operational and field test:

  • scoring guides

  • training sets

  • qualifying sets

  • validity sets

  • group discussion sets

  • recalibration set

For options requiring artificial intelligence scoring, the computer engine must be trained with West Virginia’s Six-Point Rubric and with essays written by West Virginia students which were scored with the West Virginia Six-Point Rubric.


3.2.2.4.19.1 Develop Specifications

For each administration, the vendor shall provide the Agency with a detailed scoring and reporting plan that documents the accuracy of all scoring and report production programs. This plan should detail the process of quality reviews of data and the printed report output during report processing and live report production. The vendor will work jointly with the Agency on finalizing the scoring rules and data analysis shortly after award.


Scoring verification steps should include the following:

  • developing procedures for the vendor and the Agency to independently verify the calibrating, scaling, and equating

  • verifying that all prompts are equal in difficulty

  • verifying all prompts are scored correctly

  • verifying all reader scores are correctly transferred to the student’s record

  • verifying the correct number of readers scored each response

  • verifying the final scores on hand-scored tasks are correctly calculated

  • verifying all aggregated scores are correctly rounded and reported

  • arranging for answer documents to be scanned, scored, and all student, school, county, and state reports generated and proofed by the vendor and Agency

  • developing procedures and reports to identify duplicate student records within and across counties

  • developing a detailed plan to create output files

  • scheduling customization (if needed) of the scoring software to use the WV Six-Point Rubric


3.2.2.4.19.2 Verify Scoring

Necessary software will be created for the Agency and the vendor to independently verify the calibrating, scoring, and equating of data. The vendor will provide all of the resources, including software, staff support, and data files to permit parallel calibration, scoring, and equating of data during the same time period they are producing operational scoring tables. Psychometric staff will be available for daily discussions and consultation throughout the parallel calibration periods.


The vendor must check the accuracy and consistency of all student level data before submitting the file to the Agency. This includes details such as ensuring all codes used on the final file are valid, all scores are accurately based on the students’ answers, all raw scores are aggregated correctly, all student demographic information is coded correctly, etc.
3.2.2.4.19.3 Cheating Detection

Vendors must propose additional analyses to help the Agency ensure the scores reported for each student are a valid representation of that student’s abilities. Methods should be proposed to flag student and/or school results that appear anomalous based upon comparisons to previous test results, other test results (e.g., WESTEST), or other student or school results. Vendors’ proposals must provide a complete description of the proposed analysis and include an example of a report generated as a result of this analysis.
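As one illustrative approach only (the RFP leaves the method to the vendor), school-level results could be flagged when the year-over-year change in a school’s mean score is extreme relative to the statewide distribution of such changes:

    from statistics import mean, stdev

    def flag_anomalous_schools(prior_means, current_means, z_threshold=3.0):
        """Flag schools whose change in mean score is extreme versus all schools."""
        changes = {s: current_means[s] - prior_means[s]
                   for s in current_means if s in prior_means}
        if len(changes) < 2:
            return []
        mu, sigma = mean(changes.values()), stdev(changes.values())
        if sigma == 0:
            return []
        return [school for school, delta in changes.items()
                if abs((delta - mu) / sigma) > z_threshold]

A production analysis would also compare against other test results (e.g., WESTEST) and present the flagged results in a report format approved by the Agency.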


3.2.2.4.19.4 Report Verification

An independent verification of all electronic files and each report generated by the vendor will be conducted by both the vendor and the Agency for a maximum of three counties. The vendor should allow for up to eight staff members from the Agency to check these at either the vendor’s office or at a mutually agreeable site.


The final electronic files and a copy of each report must be delivered to the Agency for data checking at least seven days prior to the mailing of the student reports to counties. The Agency will have no fewer than five working days to approve any individual file. If errors are identified on the files, additional time may be required for Agency review. Proposals may include any additional strategies that the vendor would recommend for consideration. The vendor must be prepared to regenerate files when errors are identified. The Agency will provide approval before reports are printed and shipped.
3.2.2.4.20 Handscoring Essays

Handscoring as used in this RFP refers to the scoring of the field test responses and the image-based scoring processes necessitated by Options # 3 and # 4. This process would determine the analytic traits rating and summative score of a student’s response on the Writing Assessment. The Agency desires to implement reliable and valid handscoring processes that are efficient in terms of time and expenditures. Therefore, vendors must provide evidence of their ability to assign reliable and valid scores for the methods proposed and also provide a detailed description of how the security of the prompts/passages and student responses will be maintained throughout scoring.


A high level of handscoring accuracy must be maintained while meeting challenging scoring deadlines. The vendor must utilize the resources and procedures needed to meet this requirement. Vendors will explain in detail in their proposals how the requirements of this section will be met. Student responses to field tests and operational tests will be scored by trained readers using online imaging technology. All student responses will be read by a minimum of two trained readers.
The Agency will play an integral role in guiding and monitoring all aspects of training readers and scoring essay responses. The vendor will chair rangefinder review and selection meetings with input from Agency staff. Agency content staff will review and approve all final scoring materials and monitor the training of readers and the scoring sessions. Agency staff should be expected to be on site throughout the training of readers and during most of the handscoring. When not on site, Agency staff will need online access to all handscoring systems and reports and will communicate frequently with the vendor throughout the scoring process.
3.2.2.4.20.1 Produce Handscoring Specifications

The vendor is expected to incorporate the procedural, design, and implementation requirements for scoring into written specifications developed for field test administrations and rangefinder reviews.


The vendor will produce updated handscoring specifications seven months prior to each Spring administration. The handscoring process and procedures from the previous administration will be reviewed and updated as needed after each administration in order to improve the processes for the next administration. The handscoring specifications will be a detailed guide to conducting handscoring and be used by the vendor’s handscoring managers and the Agency. The specifications will, at a minimum, include the topics listed in the next sections.
3.2.2.4.20.2 Conduct Performance Scoring Operations

Scoring responses requires a series of procedures designed to maintain the reliability and validity of test scores and to provide scoring quality control and adequate management control. This RFP calls for the vendor to implement handscoring processes and procedures according to the Agency’s requirements. Enhancements to these processes are acceptable when approved by the Agency.


3.2.2.4.20.3 Conduct Rangefinder Review Meetings

The purpose of the rangefinder review meetings is to review the criteria, supplement the initial set of rangefinder papers with additional papers if necessary, and ensure that the vendor’s scoring directors and Agency staff share the same detailed understanding of the scoring criteria for the operational responses.


The scoring standards established by the initial field-test rangefinder selection will be maintained during the subsequent rangefinder review. A meeting to review rangefinder papers and scoring guides for prompts/passages included in the operational tests will be conducted in a mutually agreed upon location prior to the beginning of handscoring training for each administration.
The vendor’s writing assessment lead scoring director and the grade scoring directors will participate in meetings with up to eight West Virginia educators and an Agency staff member, who will serve as chair, to review scoring criteria and rangefinder papers. Separate meetings will be conducted for each grade. The vendor will prepare the necessary materials for the meetings.
3.2.2.4.20.4 Conduct Rangefinder Selection Meetings

Prior to the scoring of field-test responses, the vendor will be responsible for organizing and implementing meetings to select rangefinder papers for the essays. Vendors must include a schedule for these meetings. For each prompt, the vendor will select and provide a statistically sound sample of student responses to the essay.


Following the rangefinder selection meetings, the vendor’s staff is responsible for selecting training, qualifying, and validity responses and for annotating rangefinder and training responses. Requirements for selecting validity sets are identified in the following section. Vendor staff is also responsible for developing as part of the scoring guides any additional scoring notes or criteria required to conduct accurate scoring of the essays. The Agency will approve all selections, annotations, or other materials to be used in training. These selections, annotations and materials must be entered into the content/form construction system.
3.2.2.4.20.5 Produce Scoring Materials

The vendor will develop an electronic system for cataloging and storing all scoring materials developed during the course of the project. The vendor is responsible for producing the following scoring materials for each field test and operational test:



  • scoring guides

  • training sets

  • qualifying sets

  • validity sets

  • group discussion sets

  • recalibration sets

Agency staff will work closely with the vendor’s staff to prepare scoring materials. Meetings between Agency and vendor staff will be held following the rangefinder review meeting to initiate the development of scoring materials. All scoring materials will be submitted to the Agency for review and approval. Scoring materials must be approved at least three weeks prior to the beginning of training and scoring. The vendor will be responsible for developing a detailed schedule, to be included in the handscoring specifications, identifying steps in the development of scoring materials. At the completion of scoring, the vendor will provide the Agency with organized electronic copies of all scoring materials prepared for and utilized during scoring.


3.2.2.4.20.6 Handscoring Reports

The vendor will produce daily and cumulative reader handscoring reports and will also have the capability to produce any of the reports on request throughout scoring. Reports are required for both field test and operational scoring. A subset of the reports will be transmitted to the Agency daily; other reports will be transmitted to the Agency periodically. A subset of these reports, including the primary inter-rater reliability and validity reports, will be available in real time to scoring directors and team leaders. The Agency will determine, in consultation with the vendor, which of these reports will be available to the Agency in real time. The vendor must identify and describe the proposed reports used both externally and internally to monitor the quality and pace of the scoring session.


At the completion of field test and operational scoring, the vendor will provide the Agency with final copies of all cumulative handscoring reports. The handscoring summary reports are to be made available as electronic files on CD-ROM. The vendor will produce a Technical Report that summarizes the score reports and provides details related to the reliability and validity of the field test and operational handscoring procedures.
3.2.2.4.20.7 Scoring Student Responses

Program rubrics and scoring criteria are holistic in nature, requiring reference to rangefinder papers for scoring decisions. The rubric for the Writing Assessment responses will be the West Virginia Six-Point Rubric, which represents focused analytic scoring, identifying Organization, Development, Sentence Structure, Word Choice, and Mechanics as the elements for consideration in scoring. All responses for field-test essays will be scored independently by two readers. All operational handwritten responses (Option # 3 and # 4) will be scored independently by two readers.


Assignment of responses across schools must be randomized to the extent that an individual reader scores responses from several schools within the same time period and does not score responses from the same school in succession. The Agency will work with the vendor to create a specific set of scoring rules for resolving disagreements between first and second readings, using a third reading where needed, and for calculating final scores.
For Writing Assessment scoring, note the following (a minimal sketch of the resolution rules follows the list):

  • unscorable scores are assigned by the site or scoring directors in conjunction with Agency staff.

  • discrepancies of one (1) point are resolved by averaging the scores.

  • discrepancies of two (2) or more points are resolved by an independent third reading.

  • third readings are conducted by the scoring director; complex resolutions and unscorable decisions are resolved by the site or scoring directors in conjunction with Agency staff.
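A minimal sketch of these resolution rules is shown below; the convention for combining a third reading is an assumption, since the exact rules will be created jointly with the Agency.

    def resolve_score(first, second, third=None):
        """Apply the resolution rules listed above to two (or three) readings."""
        gap = abs(first - second)
        if gap <= 1:
            return (first + second) / 2   # identical or one-point gap: average
        if third is None:
            raise ValueError("a discrepancy of two or more points requires a third reading")
        # Assumed convention only: treat the independent third reading as final.
        # Complex resolutions and unscorable decisions go to the site or scoring
        # directors in conjunction with Agency staff.
        return float(third)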

The Agency’s alert system to identify students whose responses indicate the need for an outside agency’s intervention will be implemented during scoring. The site scoring directors will send copies of the alert papers identified by readers to the Agency each day during scoring by using an overnight delivery service or through electronic media.


3.2.2.4.20.8 Monitor and Maintain Handscoring Quality

Monitoring and maintenance procedures are intended to establish and maintain high levels of scoring accuracy. An important element of these procedures is that they must quickly identify individual readers who fail to maintain acceptable scoring standards and trigger corrective strategies. The vendor must be prepared to utilize all procedures identified in this section. The vendor will also be expected to contribute additional ideas and procedures to monitor and maintain handscoring quality.


As part of the imaging and handscoring specifications for each administration, the vendor, in consultation with the Agency, will plan the combination of monitoring and maintenance procedures to most efficiently maintain the required high levels of scoring accuracy. The Agency will give final approval to these procedures:

  • Daily Systematic Review of Handscoring Reports

  • Systematic Read Behinds

  • Targeted Read Behinds

  • Scoring Validity Sets

  • Automatic Targeting

  • Targeted Validity Set Administration

  • Pseudo-Scoring

  • Group Retraining

  • Individual Conferencing

  • Dismissal -- The vendor will dismiss readers who fail to perform satisfactorily following retraining.


3.2.2.4.20.9 Handscoring Personnel

All project directors, scoring directors, team leaders, and readers must have earned a bachelor's degree. All personnel must sign an agreement with the Agency that they will maintain the security of materials, in addition to the security agreements required by the vendor. The vendor must describe its screening process for hiring personnel associated with scoring. To be hired as a reader, trainees are required to meet established standards. A reader must maintain a minimum of 70 percent perfect agreement.
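The 70 percent perfect-agreement standard is directly computable; a minimal sketch follows, with the pairing of reader scores against reference (e.g., second-reader or validity) scores assumed.

    def perfect_agreement_rate(reader_scores, reference_scores):
        """Percent of responses where the reader exactly matches the reference score."""
        if not reader_scores or len(reader_scores) != len(reference_scores):
            raise ValueError("score lists must be non-empty and of equal length")
        matches = sum(1 for a, b in zip(reader_scores, reference_scores) if a == b)
        return 100.0 * matches / len(reader_scores)

    def meets_standard(reader_scores, reference_scores, minimum=70.0):
        """True if the reader meets the minimum perfect-agreement percentage."""
        return perfect_agreement_rate(reader_scores, reference_scores) >= minimum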


3.2.2.4.20.10 Scoring Directors for Handscoring

The vendor will assign its most qualified scoring staff to direct scoring for West Virginia’s responses. They must have an appropriate educational background and extensive experience in directing state-level performance projects as members of the vendor’s regular scoring staff. All scoring directors must have worked in scoring director roles for the vendor on a regular, continuing basis. The vendor will appoint a project director to serve as the vendor’s overall director for the project.


The project director must be available on a daily basis to discuss issues with site scoring directors and the Agency either in person or via phone, email, or fax throughout the training and scoring sessions. The site scoring directors will be on site throughout the training and scoring sessions and will personally assist scoring directors during the training of team leaders and readers and throughout the scoring sessions.
The scoring directors must participate in the rangefinder review and selection meetings. The scoring director for each grade will work to ensure the rangefinder selection and the ongoing direction of operational scoring are both conducted at the highest levels of quality.

For each assessment, the project and site scoring directors will conduct training for scoring directors, with the assistance of Agency staff, after the completion of the rangefinder review meeting.


3.2.2.4.20.11 Team Scoring Leaders

Team leaders must go through the same screening process as readers. The Agency requires that team leaders have previous experience as readers and as team leaders if at all possible. At a minimum, team leaders must be experienced readers and be degreed in the assigned content area.


3.2.2.4.20.12 Recruit and Hire Readers

Vendors will include an analysis of the number of people that must be recruited, hired and subsequently qualified as readers to complete the scoring within the time required to return reports to counties by the dates designated in the project schedule. The vendor will describe the number of team leaders needed for each grade. This detailed analysis must be completed in the proposal for the 2008 field test and 2009 operational administrations.


3.2.2.4.20.13 Training and Qualifying of Readers

The vendor will conduct separate training sessions for each prompt for each grade level of Writing Assessment essay responses. The vendor will determine the training and qualifying procedures for readers. The scoring director will conduct training with the assistance of team leaders, under the direction of site scoring directors.


The purpose of the training is to ensure each person who scores responses has met the Agency's standards for scoring. The training process is essential for ensuring scores assigned to student responses provide valid and reliable information. The vendor is responsible for developing training procedures in consultation with the Agency, and the Agency will have final approval of all training techniques.
At the conclusion of training, qualified readers will be taught how to use the Agency’s alert system to identify students whose responses indicate the need for an outside agency’s intervention.
3.2.2.4.20.14 Scoring Sites

Writing Assessment scoring must be conducted at the vendor’s established scoring sites, which draw on the vendor’s most experienced pools of readers who participate in image-based scoring activities on a regular basis throughout the calendar year, unless the decision has been made to score student responses in West Virginia.


The vendor’s proposal will identify the number and locations of proposed scoring sites and which grades the vendor intends to score at each site. Due to the size of the program, vendors may be required to dedicate entire scoring sites to the handscoring of West Virginia essays. The number of sites may not exceed two for handscoring essays without Agency approval. A grade level may not be scored in more than one site. The Agency reserves the right to approve scoring sites and the distribution of grade scoring across sites.
At the Agency’s option, observers may be allowed access to the scoring centers for brief periods of time for the purpose of generally understanding the process. An Agency official or vendor staff designated by the Agency will accompany such visitors.
3.2.2.4.20.15 Expedite Performance Scoring

As stated, the Agency desires that the tests be scored and reported in the most expeditious manner possible. Vendors are expected to consider alternatives to make it possible for the statewide assessments to be processed according to a timeline shorter than the one specified. The Agency must approve any alternatives the vendor proposes. Security is of the utmost importance and any proposed scoring solution must provide security guarantees. For purposes of this proposal, all vendors must submit proposals that meet the common minimum approach set forth in the requirements of this RFP.


3.2.2.4.20.16 Overall Scoring Quality Control

The vendor shall provide quality control systems to verify the accuracy of the scoring, processing, and reporting of data. In addition, the vendor will provide the results of these quality control reviews to the Agency so that the Agency can ensure any identified problems have been rectified. In addition, the Agency may operate its own quality control operations. In so doing, the Agency may utilize the services of one or more vendors to assist in verification of the quality and accuracy of the statewide assessment results. These vendors will work under the direction of the Agency and will perform data verification checks at times and places so designated. The vendor will be obligated to provide data, information, explanations, and work space, if necessary, for data verification vendors working with the Agency.


The objective of the quality control processes is to triangulate analyses and verify the data being reported are correct. The Agency will review all quality control findings and will provide permission for the vendor to prepare and distribute test results.
The vendor also must provide a method whereby each student essay is linked to its author and an organization of documents that will facilitate searching for specific documents. The vendor must propose a solution which enables the Agency to read the files and print copies of an individual essay if needed. After acceptance of accurate computer files by the Agency, the vendor will store student essays, at vendor’s expense, for a period of two years. At the end of the two-year period, the vendor will transfer or destroy the essays according to instructions from the Agency.
3.2.2.4.21 Online Writing Technology Specifications

The Agency is looking for costs for an efficient web-based summative online writing assessment system. If an online system option is selected, the system must meet the minimum requirements set forth in the RFP. First and foremost, the system must produce results that are valid and reliable. The vendor must provide a web-based system that is secure, minimizes cheating and testing violations, and protects both the testing environment and student data. Student scores, data, test results, and analyses will belong to the Agency. The interface must be user friendly and easy to navigate and should utilize technology that will operate with minimal bandwidth usage. As for all other assessments, the vendor will provide research-based validity and reliability information and technical data to be included in a Technical Document produced after each administration.


If requested by the Agency, the vendor shall conduct a small-scale pilot of online testing and scoring for future administration of the WESTEST program. The online pilot shall be conducted for the 7th grade Writing Assessment component. As part of the pilot, the online assessment must be completed by a minimum of 3,000 students per content area. The vendor, in conjunction with the Agency, will identify a representative sample of schools to participate in the pilot and shall provide all necessary materials and technology (software, hardware, connectivity) to complete the pilot assessment. The vendor shall provide scoring of assessments completed in the online pilot.
If indicated, the vendor shall complete a study of the comparability of the results of the online assessments and the paper/pencil assessments and shall provide a written report summarizing the findings of the study to the Agency. The vendor shall provide evidence of the validity of online testing and scoring in the context of a large-scale assessment.
The vendor shall submit a detailed written plan to the Agency for an expansion of the online assessment to include additional grade levels as specified by the Agency. The vendor’s plan for the expanded online assessment shall identify specific activities that will be performed by the vendor and a guaranteed not-to-exceed total price for the online assessment. Such guaranteed not-to-exceed total price shall be based upon the firm, fixed price per student for the Online Assessment Pilot as stated in the pricing proposal.
The testing platform must provide:

  • an online procedural tutorial that teaches students how to use the testing system

  • an easy-to-navigate interface within the exam

  • the ability to easily return to exam questions when a student has either stopped the exam for a brief or extended time, or when a student may have skipped a question during testing

  • a feature that shows both answered and unanswered exam items

  • a confirmation request feature that students must respond to before exiting the exam

  • clear, easy-to-view graphics that support ease of readability and focus

The vendor must provide for committee review screen captures of the following items: the student log-on screen, the student test, and the student procedural tutorial demonstrating how to use the testing system.


Vendors’ systems must work with the existing technology infrastructure of the Agency and the West Virginia county school systems (refer to Section 3.1.1). The vendor must ensure that all schools using this product are able to successfully utilize the system. The vast majority of schools in West Virginia have a partial T1 or a full T1 for connectivity to the internet. A small minority of schools use a dial-up system for internet access. All internet traffic is routed through the statewide infrastructure before connecting to the public internet.
Vendors should also realize that although the vast majority of West Virginia schools deploy their computers in a lab-type setting, not all schools have opted for this approach. Each individual county school system is responsible for updating and upgrading the computers in its schools. School-level personal computer hardware and operating systems utilize Microsoft® Windows 98 or greater.

The vendor should describe, and provide evidence of, how the proposed technology solution will work in this environment and identify the following:



  • Minimum hardware requirements for classroom use. The minimum hardware requirements for the vendor’s software should not exceed the configuration found in Section 3.1.1, Table 2. The software should operate on that classroom computer configuration, as well as configurations found in the Digital Divide, to serve the maximum number of students.

  • Minimum hardware requirements at other locations.

  • Minimum software requirements including operating system, browser, etc.

  • Configuration of the technical route/specifications from student access/input to final report.

  • Bandwidth requirements.

  • Software specifications for processing input, how data is saved, how often data is sent to storage or remote site, etc.

  • Technical issues with firewalls, desktop security programs, etc.

  • Any other technical requirements or issues.


3.2.2.4.21.1 Security

The vendor must provide evidence that the system is secure. The system will meet federal compliance regulations for both FERPA (Family Educational Rights and Privacy Act) and COPPA (Children’s Online Privacy Protection Act of 1998). The Agency requires a system with a locked-down desktop that will not allow students to leave and re-enter the program or go to other Internet sites, programs, emails, or files.


The vendor will assure that the spell-check and grammar-check features are not accessible to students during testing. Students also should not have access to a tutor, thesaurus, or other instructional aids during testing. The vendor will assure the Agency that the West Virginia Scoring Rubric and the Writing Checklist will be available to students during testing. The vendor will be responsible for assigning the secure student passwords/logins necessary for entering the testing platform.
The program must be able to track student users via a secure system of user IDs utilizing each student’s WVEIS number. The testing platform must include a user interface that is password protected with unique alpha-numeric passwords that are changed on a regular schedule or for each individual assessment. The vendor will work with the Agency to obtain a list of users to be loaded into the system and to determine the password and user name schemes to be employed.
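A minimal sketch of issuing unique alpha-numeric passwords keyed to WVEIS user IDs appears below; the password length and regeneration policy are assumptions to be settled with the Agency.

    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits   # alpha-numeric characters only

    def issue_passwords(wveis_ids, length=10):
        """Assign each WVEIS user ID a fresh, random alpha-numeric password."""
        return {wveis_id: "".join(secrets.choice(ALPHABET) for _ in range(length))
                for wveis_id in wveis_ids}

    # Regenerate on the agreed schedule, or once per individual assessment.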
3.2.2.4.21.2 Administration

In order to assure that the vendor’s delivery system is reliable, the vendor, in conjunction with the Agency, will conduct a pilot test in a variety of schools in approximately 15 counties.


The vendor will assure reliable delivery of the operational test and of all other online assessments. The vendor will provide documentation that the platform has built-in features to prevent the loss of student essays and other student data both during administration and during scoring/reporting. The vendor must provide, for committee review, the vendor’s specifications for prevention of data loss. The student-level data must be encrypted and located behind a secure firewall.
3.2.2.4.22 FTP Site

Vendors must include provisions for a secure FTP site for data transfer from the vendor to the Agency and from the Agency to the vendor. This site must be maintained by the vendor to ensure its continued availability throughout the life of the program.
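For illustration, a secure transfer of this kind is often implemented over SFTP; the sketch below uses the third-party paramiko library, and the host, credentials, and paths are placeholders rather than anything specified in this RFP.

    import paramiko  # third-party library: pip install paramiko

    def upload_file(host, username, key_path, local_path, remote_path):
        """Upload one data file to the secure site over SFTP."""
        client = paramiko.SSHClient()
        client.load_system_host_keys()   # connect only to already-known hosts
        client.connect(host, username=username, key_filename=key_path)
        try:
            sftp = client.open_sftp()
            sftp.put(local_path, remote_path)
            sftp.close()
        finally:
            client.close()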


3.2.2.4.23 Scoring/Reporting

The summative score must be determined by trait to yield a composite of all analytic trait scores. The summative score will be used for the WESTEST score. This composite score must be translated into performance levels via cut scores established through an appropriate standard-setting procedure. Analytic trait scoring will provide for an individual analysis of the strongest and/or weakest traits for the student report and aggregate grade-level reports (school, county, and state).
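A minimal sketch of the trait composite and cut-score translation described above follows; the trait keys, cut points, and level labels are placeholders, since the real cuts come from the standard-setting procedure.

    TRAITS = ("organization", "development", "sentence_structure",
              "word_choice", "mechanics")

    # Placeholder cut scores and labels only; actual values are set by standard setting.
    CUTS = [(25, "Distinguished"), (20, "Above Mastery"),
            (15, "Mastery"), (10, "Partial Mastery")]

    def summative_score(trait_scores):
        """Composite of all analytic trait scores (each on the six-point rubric)."""
        return sum(trait_scores[trait] for trait in TRAITS)

    def performance_level(total):
        """Translate the composite into a performance level via the cut scores."""
        for cut, label in CUTS:
            if total >= cut:
                return label
        return "Novice"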

The vendor must provide evidence that the vendor platform is capable of:


  • scoring/reporting/aggregating up to 4 prompt types

  • providing reports that aggregate scores for federally designated AYP subgroups

  • providing holistic and analytic scores for the individual student

  • providing paper-based quality Individual Student Reports

  • providing web-based School/County/State Reports

  • providing a human review of No Scores if artificial intelligence scoring is used

  • providing a system for identifying and retrieving alert papers

  • providing 10% human read behind

  • monitoring to identify open essays

  • providing weekly progress reports on completed and open essays

  • electronic or manual random generation of prompt assignments within schools and classrooms

  • field testing the prompts in the same environment that will be used for the operational assessment

  • archiving and retrieving essays for a minimum of one year


3.2.2.4.24 Reporting

The Agency desires easy-to-understand reports that are creative, attractive, and technically defensible. Vendors should present innovative report designs that take advantage of current technologies for color printing and data merging. Reports should look similar to the WESTEST reports (http://westest.k12.wv.us/reports.htm) and the Writing Assessment reports (http://writing.k12.wv.us/writingresults2006.htm#state). The reports must provide numeric, verbal, and graphic presentations of assessment results that effectively communicate with intended audiences, including students, teachers, parents, and the general public.


The proposal should document the capabilities of the vendor to fulfill West Virginia report requirements within the prescribed time limits. Vendor procedures and report guidelines are found in this section. The vendor’s references must be able to substantiate these capabilities.
At a minimum, the vendor must supply the reports listed in the Reports Descriptions Section (see Section 3.2.2.4.24.1).
The vendor shall provide for committee review a detailed scoring and reporting plan that documents the accuracy of all scoring and report production programs. This plan must detail the process of quality reviews of data and the printed report output during report processing and live online report production.
Minor adjustments to the reports should be anticipated by the successful vendor. In responding to the reporting requirements, vendors are encouraged to suggest combinations of report formats or innovative graphic or numeric displays.
3.2.2.4.24.1 Reports Descriptions/Timelines

The vendor’s proposed timeline for completing proposed services must respect the Spring 2008 Field Test and 2009 Operational Test administration dates. All student reports are to be separated by school before being shipped to counties or to the Agency.


In addition, the vendor will be prepared to process missing or erroneous reports throughout the duration of the contract. Copies of data files for each test administration shall be maintained throughout the duration of the contract. Three distribution levels will be specified - school, county, and state. Individual Student Reports will be shipped to the County Test Coordinator at each central office.
While the Writing Assessment scores will be combined with the WESTEST Reading/Language Arts scores, separate Writing Assessment reports shall be generated within four weeks of the last date of the testing window:
1. STUDENT REPORT:

The individual Student Report will at a minimum include the following:



    • Student name, grade, date of birth, WVEIS #, class, school, county, state, prompt and explanatory information about the scores

    • Student analytic scores, total summative score, performance level

    • Definition of terms

    • Analytic rubric

    • Performance level descriptors

      • One side of student report will capture the performance level descriptors


2. CONFIDENTIAL ROSTER REPORT:

The Confidential Roster Report shall be organized by grade by school and by grade by county and shall at a minimum include the following:



  • Student names in alphabetical order (last name, first name, middle initial), grade, date of birth, WVEIS number, school, county, test date, prompt type

  • Analytic trait scores for students, summative score, performance level
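
A minimal sketch of the required roster ordering; the record fields are assumed for illustration:

    # Roster rows sort alphabetically: last name, then first name, then
    # middle initial (case-insensitive), within each grade/school grouping.
    students_in_grade = [
        {"last_name": "Smith", "first_name": "Ann", "middle_initial": "B"},
        {"last_name": "Adams", "first_name": "Jo",  "middle_initial": "C"},
    ]

    def roster_key(s):
        return (s["last_name"].lower(), s["first_name"].lower(),
                s["middle_initial"].lower())

    roster = sorted(students_in_grade, key=roster_key)   # Adams before Smith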


3. CONFIDENTIAL SUMMARY REPORTS (School, County and State):

The West Virginia Confidential Summary Report shall summarize the writing performance levels for all students of the same grade who wrote to the same prompt. This report shall be prepared for each school, each county, and the state. It shall contain a graph of the percent of students who attained each performance level category. The report shall also show the distribution of analytic trait scores for the group. The report shall be organized by grade by school, by grade by county, and by grade by state and shall at a minimum include the following (an aggregation sketch follows this list):




  • Grade, state, test date

  • Number of students tested by subgroups (all, gender, race/ethnicity, students with disabilities, Limited English Proficient students, migrant, economically disadvantaged)

  • Performance levels by aggregate number and percent of students at each performance level

  • Number of students tested by grade level mastery of writing standards

  • Definition of all unfamiliar terms on the report

  • Writing performance levels for all students of the same grade who wrote to the same prompt shall be summarized together

  • Writing performance levels for all students of the same grade level must be reliably combined with the Reading Language Arts score
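
A minimal aggregation sketch for the summary lines above, assuming student results arrive in a pandas DataFrame; the column names are illustrative, not the Agency’s layout:

    # Number and percent of students at each performance level, per
    # school/grade/prompt group (one group = one summary line).
    import pandas as pd

    df = pd.DataFrame({
        "school":            ["A", "A", "A", "B"],
        "grade":             [7, 7, 7, 7],
        "prompt":            ["P1", "P1", "P1", "P1"],
        "performance_level": ["Mastery", "Novice", "Mastery", "Distinguished"],
    })

    counts = (df.groupby(["school", "grade", "prompt"])["performance_level"]
                .value_counts()
                .rename("n")
                .reset_index())
    counts["pct"] = (counts["n"]
                     / counts.groupby(["school", "grade", "prompt"])["n"]
                             .transform("sum") * 100)
    print(counts)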


4. CONFIDENTIAL AGGREGATED SUMMARY REPORT (School, County and State):

The West Virginia Confidential Aggregated Summary Report shall summarize the writing performance levels for all students of the same grade who wrote to all prompts. The report shall contain a graph of the percent of students who attained each performance level category. The report shall also show the distribution of analytic trait scores for the group.

The report shall be organized by grade by school, by grade by county, and by grade by state and shall at a minimum include the following:


  • Grade, state, test date

  • Number of students tested by subgroups (all, gender, race/ethnicity, students with disabilities, Limited English Proficient students, migrant, economically disadvantaged)

  • Performance levels by aggregate number and percent of students at each performance level

  • Definitions of all unfamiliar terms on the report

  • Writing performance levels for all students of the same grade who wrote to all prompts shall be summarized together

Once all corrections have been made by the vendor and the Agency, the vendor will supply all corrected reports to the Agency via an FTP site within 8 weeks of receipt and scoring of the assessments.



5. GENERAL RESEARCH FILE:

A report shall be programmed and made available to provide electronic data for preparing accountability reports. This file shall be organized by school, grade, county, state, and subgroup, and shall agree with the data reported on summary lines in the school, county, and state level reports.
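
One possible shape for producing such a file, sketched under the assumption that student-level results sit in a pandas DataFrame; the column and subgroup names are placeholders:

    import pandas as pd

    def build_research_file(students: pd.DataFrame) -> pd.DataFrame:
        """One row per unit (school/county/state), grade, and subgroup,
        mirroring the printed summary lines."""
        frames = []
        for level in ["school", "county", "state"]:
            g = (students.groupby([level, "grade", "subgroup"])
                         .agg(n_tested=("wveis_number", "count"),
                              mean_summative=("summative_score", "mean"))
                         .reset_index()
                         .rename(columns={level: "unit"}))
            g["unit_type"] = level
            frames.append(g)
        return pd.concat(frames, ignore_index=True)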


6. STUDENT LABEL:

The Student Label will at a minimum include the following:

      • Student name, grade, school, test date, gender, date of birth, WVEIS number, content area, prompt type, analytic score, summative score, performance level

      • Self-adhesive to allow attachment to the student record


3.2.2.4.24.2 Develop Specifications

The vendor is responsible for developing specifications for each administration that describe in detail all the steps to be implemented and demonstrate to the Agency that the final reports of results are accurate. The vendor is expected to incorporate the procedural, design, and implementation requirements for reporting tasks into written specifications, initially developed by September 2007 for the Spring 2008 Field Test administration. The vendor must produce final specifications and mockups of proposed report forms for each subsequent administration within a similar timeline.


The vendor is responsible for drafting specifications for each report that include the following (one possible structured form is sketched below):

  • a description of the report

  • how the data on each report are generated (i.e., which population of students)

  • in which shipment the report is included

  • who receives the report with the number of copies received

  • a sample of the report

This plan should detail the process of electronic quality reviews for the data and the printed report output during report processing and live report production. The vendor will work jointly with Agency on finalizing the scoring rules and final reporting considerations shortly after award.
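
One possible structured form for a single report’s specification, covering the five required elements above; every name and value is a placeholder:

    # Illustrative specification entry; real content is set with the Agency.
    report_spec = {
        "report": "Confidential Roster Report",
        "description": "Grade-by-school and grade-by-county roster of "
                       "analytic trait scores and performance levels.",
        "population": "all students tested in the administration",
        "shipment": 2,                                   # which shipment
        "recipients": {"County Test Coordinator": 1,     # copies per recipient
                       "Agency": 2},
        "sample": "mockups/roster_sample.pdf",           # sample of the report
    }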


3.2.2.4.24.3 Report Development

The units of analysis for inclusion in West Virginia reports are the student, class, school, county, and state. At the school, county, and state levels, reports will include subgroup results (e.g., economically disadvantaged students, students with disabilities, Limited English Proficient students, major racial and ethnic groups, gender, and migrant students). Special education reports will also be provided at the school, county, and state levels. Test results will be reported by scale score and achievement level.


Test results will be reported by analytic trait scores of Organization, Development, Sentence Structure, Word Choice and Mechanics; a summative score of the analytic traits; and an achievement level. Also, Writing Assessment results will be combined with WESTEST Reading/Language Arts scores.
The vendor and Agency will extensively review all data files before they are used to produce live reports. The vendor must produce a live data file with a sample population composed of at least three counties selected by the Agency. This file will be used to check student-level and aggregated data for each grade. Reports will be created from this live data check file; both the file and reports shall be sent to Agency for verification and approval.
The Agency will review the data and draft reports and will work with the vendor to resolve any questions. The Agency expects the vendor to conduct an extensive quality check before the file and final reports are sent to the Agency.
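
A minimal sketch of assembling the live data check file; the county codes and column names are placeholders, and the actual counties will be selected by the Agency:

    import pandas as pd

    SELECTED_COUNTIES = ["001", "014", "027"]   # placeholder county codes

    def live_check_extract(students: pd.DataFrame) -> pd.DataFrame:
        # Pull every student record from the sample counties so both
        # student-level and aggregated values can be verified per grade.
        sample = students[students["county_code"].isin(SELECTED_COUNTIES)]
        return sample.sort_values(["county_code", "school_code",
                                   "grade", "last_name"])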
3.2.2.4.24.4 Update Report Designs

The vendor is responsible for annually reviewing and updating the design of the individual student, school, county, and state reports of test results in consultation with the Agency. Though report formats are not expected to change extensively from year to year, the vendor should, after each administration, solicit reporting requirements from the Agency and make any changes required by the Agency until final approval is given, at no extra cost to the Agency.


3.2.2.4.24.5 Report Delivery

During each administration, numerous reports and data files are provided to students, schools, counties, and the state for students, parents, educators, and the public, with data aggregated in various ways. The actual reports and data files to be generated are described in Section 3.2.2.4.24.1. The vendor must prepare the data files using formats approved by the Agency. In addition to reports of results, other reports are required, including missing secure materials reports, duplicate tester reports, etc. Many reports are required to be available as electronic files in formats compliant with Section 508 of the Rehabilitation Act (refer to: http://www.section508.gov/) and to allow the files to be both viewed on a website and downloaded.


3.2.2.4.24.6 Report Phases

The vendor will not provide individual student data or reports for the field test administration.


Following vendor quality checks, reports for the spring test administration should be delivered to the Agency and counties via secure FTP site within 8 weeks after the vendor receives the tests. All student reports will be sent to the County Test Coordinators. The Agency expects to distribute school and county reports via the secure FTP site. The vendor and Agency will develop a process by which school/county PDF files can be loaded onto the FTP site.
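
A minimal sketch of one way the PDF loading step might work, assuming SFTP is the agreed secure transfer mechanism; the host, credentials, and paths are placeholders:

    import paramiko

    def upload_report_pdfs(pdf_paths, host, username, key_file, remote_dir):
        client = paramiko.SSHClient()
        client.load_system_host_keys()                  # accept known hosts only
        client.set_missing_host_key_policy(paramiko.RejectPolicy())
        client.connect(host, username=username, key_filename=key_file)
        sftp = client.open_sftp()
        try:
            for path in pdf_paths:
                name = path.rsplit("/", 1)[-1]
                sftp.put(path, f"{remote_dir}/{name}")  # one PDF per school/county
        finally:
            sftp.close()
            client.close()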
All student reports for schools are to be packaged by school but sent to the County Test Coordinators. All printed products will be proofed by the vendor, and copies will be sent to the Agency for proofing and approval prior to mailing any product to the counties.
The Agency must review at least one copy of each report before shipment of any reports. All reports should be original printed copies. The vendor shall be responsible for maintaining copies of electronic data files for each test administration.
The Agency reserves the right to request some records be removed from processing until specific issues are resolved. These issues include duplicate records, records with blank student identification numbers and/or blank names, schools or students whose test records are under investigation for possible cheating, or other issues that might affect school totals. The issues regarding the suppressed records will be dealt with as soon as possible after reporting is completed.
The vendor may be requested to change the score reported flag on the file to one that would not report the student’s score, pull test documents to resolve duplicate tester issues, add a corrected student identification number or corrected name to a record, produce Individual Student Reports (as directed), and/or Confidential Roster Reports. The vendor will work with the Agency to establish a timeline for the processing and reporting of these records.
3.2.2.4.24.7 Electronic Records

For each administration, the vendor will also supply the Agency with an electronic file, in a format approved by the Agency, containing data aggregated by grade and subject for each school, county, and the state. These electronic records will agree with the data reported on summary lines in the county, state, and school level reports. Additional summary statistics for each school and county and the state will be reported by disaggregated characteristics such as racial/ethnic group, gender and other demographic information. Every summary statistic printed in the paper reports shall be represented in this file.


The vendor will be responsible for checking to ensure all files are consistent and accurately reflect the data provided on the reports. The Agency will independently verify the consistency and accuracy of the data files.
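
A minimal sketch of such a consistency check, using the number of students tested as the example statistic; the DataFrame column names are assumptions:

    import pandas as pd

    def verify_aggregates(students: pd.DataFrame, summary: pd.DataFrame):
        # Recompute each summary statistic from student-level data and
        # require exact agreement with the electronic summary file.
        recomputed = (students.groupby(["school", "grade"])
                              .size().rename("n_tested").reset_index())
        merged = summary.merge(recomputed, on=["school", "grade"],
                               suffixes=("_file", "_recomputed"))
        bad = merged[merged["n_tested_file"] != merged["n_tested_recomputed"]]
        assert bad.empty, f"{len(bad)} summary rows disagree with student data"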
3.2.2.4.24.8 Optional Reporting Services

The Agency requests that each vendor provide costs/quotes for any optional services, enhancements or projected updates in the proposal. Please provide information about product efficiency, usability, expanded reporting capabilities and costing. These optional services must be available upon request by the Agency.


3.2.2.4.24.9 Disaster Recovery Plan

The vendor shall provide a description of the plan to back up all systems, applications, and databases routinely to both an onsite and an offsite location. Additionally, the vendor shall detail the plan for data recovery in the event a disaster is declared where the data is maintained and stored. Database transaction logs should be archived and maintained online for 48 hours.
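
A minimal sketch of the backup and 48-hour log retention requirements, assuming a PostgreSQL store; the paths, database name, and offsite step are placeholders for the vendor’s actual plan:

    import subprocess
    from datetime import datetime, timedelta
    from pathlib import Path

    BACKUP_DIR = Path("/backups/onsite")     # replicated offsite separately
    LOG_DIR = Path("/backups/txn_logs")

    def nightly_backup(db_name: str) -> Path:
        stamp = datetime.now().strftime("%Y%m%d%H%M")
        dump = BACKUP_DIR / f"{db_name}_{stamp}.dump"
        subprocess.run(["pg_dump", "-Fc", db_name, "-f", str(dump)], check=True)
        return dump   # an offsite copy (e.g., rsync) would follow here

    def prune_txn_logs(retention_hours: int = 48) -> None:
        # Keep archived transaction logs online for 48 hours, per the RFP.
        cutoff = datetime.now() - timedelta(hours=retention_hours)
        for log in LOG_DIR.glob("*.log"):
            if datetime.fromtimestamp(log.stat().st_mtime) < cutoff:
                log.unlink()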


3.2.2.4.25 Data Management

The Agency requires the vendor’s data management system to interface with the Agency’s data management system. A general information file of individual student performance must be submitted to the Executive Director, Office of Technology, immediately upon completion of scoring; note that the vendor has three weeks to score multiple-choice test options and six weeks to score tests with constructed-response items.


The Agency requires vendors to provide software solutions for the management and disaggregation of data at the class, grade, school, county, and state levels. In addition, disaggregated group reports must be available by the following classifications:

  • Limited English Proficient

  • race/ethnicity

  • gender

  • free/reduced lunch

  • migrant

  • special needs

  • other groups as specified over the life of the contract

School, county, and state level reports will be provided by the vendor through a secure FTP site in the same format as the specified reports. The vendor will work in conjunction with the Agency to finalize the data and data layout. The Agency will have final approval of variable names and the formatting layout. All optional items for purchase will follow the established format.
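
A minimal sketch of writing a delivery file against an Agency-approved layout; the variable names and column order below are placeholders for the layout finalized after award:

    import csv

    APPROVED_LAYOUT = [              # (approved variable name, internal column)
        ("WVEIS_NO",  "wveis_number"),
        ("SCH_CODE",  "school_code"),
        ("CNTY_CODE", "county_code"),
        ("GRADE",     "grade"),
        ("PERF_LVL",  "performance_level"),
    ]

    def write_delivery_file(records, out_path):
        # records: iterable of dicts keyed by the internal column names
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([name for name, _ in APPROVED_LAYOUT])
            for rec in records:
                writer.writerow([rec[col] for _, col in APPROVED_LAYOUT])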


Any proprietary software required (along with all software support) to read the data must be included for the Agency and updated throughout the contract. Vendors are to describe this software in full. If additional copies will be required at the county level, pricing for this must be included for the life of the contract.
3.2.2.4.26 Final Delivery of Materials

The vendor agrees to deliver to the Agency or destroy, upon request, all materials and products in all forms developed for and used in conjunction with this project within 30 days following acceptance by the Agency of the final report for the project including the following:



  • test items and performance tasks

  • graphics

  • scoring materials

  • test books

  • answer documents

  • final electronic files of ancillary materials

  • computer discs, CDs, DVDs, or other media

  • computer listings

  • computer files

  • paper files

Payment of the final project invoice will not be made until all materials and certification of destruction, as appropriate, are received and approved by the Agency and final payment resolution is agreed to by both parties. Written verification of the delivery or destruction will be provided to the Agency as part of the final contract report.


3.2.2.4.27 Transition Plan

At the conclusion of the program, either through the successful completion of the contract period or through termination, it is expected the vendor will directly and fully participate in the transfer of the program to another vendor. Several steps will be involved, and the vendor of the program associated with this RFP is expected to take a leadership role. Vendors are expected to provide a plan describing, at a minimum:



  • Duration of the transition

  • Transition meetings

  • Designated team members

  • FTE required by person

  • Determination of the core transition team

  • Establishment of common definitions for terms

  • Documentation of all meetings and agreements in detail

  • Determination of contractor responsibility for each aspect of important deliverables

  • Documentation of all transferables as sent to the new vendor, and the new vendor’s inspection of all materials

  • Documentation/identification of all psychometric files and like documents to be transferred

  • Monitoring of the timely delivery of agreed transferables

  • Plans for appropriate replication studies involving the Agency Technical Advisory Committee (equating and handscoring studies at a minimum)

  • Transfer of all student essays supplied by West Virginia students; all student essays supplied by the Agency shall remain the property of the Agency

Vendors’ proposals should address completing a full plan for transition prior to the conclusion of 2008. This plan will be updated annually to reflect new information.


TABLE 34: Transition Plan

Meeting Number | Meeting Purpose
1 | Discuss Data Collection, Scanning, and Processing
2 | Discuss Analysis and Reporting
3 | Discuss Master Schedule
4 | Discuss Psychometrics, including replication study
5 | Discuss Handscoring


3.2.2.4.27.1 Transition Psychometric Issues

Vendors should define in their proposals a set of key psychometric issues to be considered at the time of transition. At an absolute minimum, equating and handscoring replication studies will be included. An exploration should be conducted (and defined here) to determine whether any change (positive or negative) in test results is solely related to student performance and not to change(s) in testing methodology or materials from vendor to vendor. The final transition plan will include a literature review detailing past efforts in transitioning programs, with a series of recommendations.
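
As one concrete illustration of an equating replication check (a standard textbook form, not the Agency’s prescribed method): linear equating maps new-form raw scores onto the reference scale, and re-running the conversion on a common data set lets both vendors confirm that score changes reflect students rather than methodology:

    from statistics import mean, pstdev

    def linear_equate(x, new_form_scores, ref_form_scores):
        """e(x) = mu_y + (sigma_y / sigma_x) * (x - mu_x)."""
        mu_x, mu_y = mean(new_form_scores), mean(ref_form_scores)
        sd_x, sd_y = pstdev(new_form_scores), pstdev(ref_form_scores)
        return mu_y + (sd_y / sd_x) * (x - mu_x)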




Table 35: West Virginia Materials Production Specifications
Writing Assessment Field Test 2008

Item # | Admin. | Grade | Description | Color Specs | Packing Quantities | Pages/Form | Number of Forms
1 | Spring 2008 FT | 3 – 11 | Field Test Scannable Answer Document | 2 colors for cover, 2 colors for inside | 10s | 4 | 1
2 | Spring 2008 FT | 3 – 11 | Test Examiner Manual | 2c/1c | 1 per package | 48 | 1
3 | Spring 2008 FT | 3 – 11 | Test Coordinator Manual | 2c/1c | # of examiners per school | 48 | 1
4 | Spring 2008 FT | 3 – 11 | Security Checklist (if applicable) | 1c | – | Triplicate forms | 1
5 | Spring 2008 FT | 3 – 11 | District Header | 3c | – | 2 | 1
6 | Spring 2008 FT | 3 – 11 | School Header | 3c | – | 2 | 1
7 | Spring 2008 FT | 3 – 11 | Answer Book Envelopes | N/A | N/A | – | N/A
8 | Spring 2008 FT | 3 – 11 | Labels | 1c | # students per school/per grade | – | 1
9 | Spring 2008 FT | 3 – 11 | Materials Checklist | 1c | – | 2 | 1
10 | Spring 2008 FT | 3 – 11 | Memos that Vendor sends out | – | – | 2 | 1
11 | Spring 2008 FT | 3 – 11 | Large Print | – | – | – | –
12 | Spring 2008 FT | 3 – 11 | Braille | – | – | – | –


Table 36: West Virginia Materials Production Specifications
Writing Assessment Operational Test 2008 – 2014

Item # | Admin. | Grade | Description | Color Specs | Packing Quantities | Pages/Form | Number of Forms
1 | 2009 – 2014 | 3 – 11 | Scannable Answer Document | 2 colors for cover, 2 colors for inside | 10s | 4 | 1
2 | 2009 – 2014 | 3 – 11 | Examiner Manual | 2c/1c | # of examiners per school | 48 | 1
3 | 2009 – 2014 | 3 – 11 | Test Coordinator Manual | 2c/1c | 1 per package | 48 | 1
4 | 2009 – 2014 | 3 – 11 | Security Checklist (if applicable) | 1c | – | Triplicate forms | 1
5 | 2009 – 2014 | 3 – 11 | District Header | 3c | – | 2 | 1
6 | 2009 – 2014 | 3 – 11 | School Header | 3c | – | 2 | 1
7 | 2009 – 2014 | 3 – 11 | Answer Book Envelopes | N/A | N/A | – | N/A
8 | 2009 – 2014 | 3 – 11 | Labels | 1c | # students per school/grade, times six years | – | 1
9 | 2009 – 2014 | 3 – 11 | Materials Checklist | 1c | – | 2 | 1
10 | 2009 – 2014 | 3 – 11 | Memos that Vendor sends out | – | – | 2 | 1
11 | 2009 – 2014 | 3 – 11 | Large Print | – | – | – | –
12 | 2009 – 2014 | 3 – 11 | Braille | – | – | – | –
