Request for Proposal (RFP)




3.2.1.14.3 Test Examiner’s Manual


A separate manual will be developed in conjunction with the Agency for WESTEST K-2 for the field test and for each operational year. After the first operational year, the manual will be revised to reflect needed changes. The vendor shall develop Test Examiner's Manuals containing administration instructions for the appropriate grade-level assessments, general information about how to conduct the assessment, and specific test instructions. The manuals will also include information about the security of materials. Each manual will be available in an electronic format accessible via the Agency's website prior to hardcopy distribution.
The vendor shall submit to the Agency in a timely manner the final proofs of the test booklets and answer documents, along with the proofs of the Test Examiner's Manuals, to ensure the testing instructions are consistent with the testing instruments. All manuals will be subject to the same proofing and printing stages as the test booklets. The vendor shall submit a schedule for second copy reviews, camera copy reviews, and blueline reviews. The vendor shall ensure manual proofs are free of typographical and format errors before they are submitted to the Agency. The vendor should include examples of Test Examiner's Manuals for committee review.

3.2.1.14.4 County Test Coordinator’s Manual


A County Test Coordinator’s Manual will be developed for the field test and for each operational year. After the first operational year, the manual will be revised to reflect needed changes. Manuals will contain coordinator-level instructions for handling and administering the appropriate grade level assessments. They also will include information about security of materials. The manual will be available in an electronic format accessible via the Agency’s website prior to the hardcopy distribution.

The vendor shall submit to the Agency in a timely manner the final proofs of the test booklets, answer documents, and Test Examiner's Manuals, along with the proofs of the County Test Coordinator's Manual, to ensure the testing instructions are consistent with the testing instruments. The vendor shall ensure that manual proofs are free of typographical and format errors before they are submitted to the Agency. All manuals will be subject to the same proofing and printing stages as the test booklets and the Test Examiner's Manual. The vendor should include examples of County Test Coordinator's Manuals for committee review.


3.2.1.14.5 Other Ancillary Documents

The vendor will also print any additional materials needed to implement the project, such as transmittal memoranda, packing labels, and packing lists. These documents include, but are not limited to:



  • County Header

  • School Header

  • Answer Document Envelopes

  • Labels

  • Materials Checklist

  • Memo

All ancillary documents will be subject to the same proofing and printing stages as the consumable test booklets. The vendor shall ensure all proofs are free of typographical and format errors before they are submitted to the Agency. The vendor shall submit a schedule for reviews of the manuals and other ancillary materials.


3.2.1.14.6 Braille and Large Print Documents

For student-level test documents, the vendor will provide sufficient quantities of Braille and large print versions at each grade level for visually impaired students. A publisher of Braille and large print materials approved by the Agency will produce the large print and Braille versions of the consumable test books and other documents at the vendor's expense. The vendor will provide the electronic files in the correct format to the Braille and large print subcontractor(s). The vendor will also develop the test administrator notes and scripts accompanying the Braille test versions.


For Braille printed products, the vendor is responsible for having the materials proofed by an independent party. Assurances must be provided to the Agency that all errors identified by the independent Braille proofreader were corrected. The Agency may also employ the services of internal, or external, Braille proofreaders. The vendor should provide a timeline for the review of all Braille materials.
Large-print documents will be printed using the current American Printing House for the Blind (APHB) recommendations (refer to http://www.aph.org/), in a minimum of 18-point type on approved paper. Should these APHB recommendations change during this contract, the Agency retains the right to require reformatting of documents to meet the new specifications. The bound test booklet should be no larger than 14” x 17”.
The development contractor will provide large-print and Braille versions of various documents referred to elsewhere in this RFP, in addition to the test forms. These publications will be produced so that they can be delivered to counties in the same shipment as the regular-format versions of these products, as identified in this RFP.

3.2.1.14.7 Materials Distribution/Retrieval

Many forms and materials are needed to implement the WESTEST K-2. Some of the materials listed in this section will assist schools, counties and the state in implementing quality control procedures and will ensure the integrity of the data collected by the program. The Agency also uses special forms to evaluate the quality of the assessment program and its implementation annually.


The vendor is responsible for all arrangements and costs associated with packing, distributing, and returning materials. Prompt and accurate delivery of materials is important to the Agency and to local county personnel who have the responsibility of managing test materials. Using bar code labeling systems, the vendor must account for 100 percent of all test booklets and answer booklets returned by the counties. The vendor must guarantee distribution procedures are in place. Vendors' proposals must include descriptions of the procedures they will use to complete these tasks.
3.2.1.14.8 Pre-identification of Answer Documents

The vendor will be responsible for pre-identifying answer documents using data transmitted via an electronic medium from the Agency for each administration. The vendor will assume pre-identification will be used for 100 percent of the school population. The vendor must establish a system to allow the Agency to transmit pre-identification data electronically over a secure data transmission network accessible only to authorized users. The vendor will bear the cost of establishing the system and providing network-specific software, if needed by the Agency, to access the system.


For the operational administrations, the Agency will submit pre-identification files to provide estimated counts for print materials. A second file will be submitted for labels and pre-identified answer documents. The vendor will then send labels (packaged by school) to arrive in districts two weeks before the test administration. Counties will have the option of requesting pre-identified answer documents to be packaged alphabetically by school, class, or other groupings as defined by the order in which student records are supplied on data files.

The counties will have the option of specifying the sort order for the pre-identified answer documents/labels. The vendor will establish a system to ensure the pre-identified answer documents/labels delivered to counties contain valid data, reflect the options selected by counties, are accurate, and are printed at a level of quality that permits accurate scanning and precludes smudging. The vendor will provide the Agency with a checking program to be used before submitting data to the vendor to help ensure all data fields include acceptable data.
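As a minimal illustrative sketch, such a checking program might validate each pre-identification record before submission. The field names, valid codes, and file layout below are assumptions for illustration only, not the actual WVEIS pre-identification file specification:

```python
# Hypothetical pre-submission checker. Field names and valid codes are
# illustrative assumptions, not the real WVEIS file layout.

VALID_GRADES = {"K", "1", "2"}          # WESTEST K-2 grade levels
REQUIRED_FIELDS = ("student_id", "last_name", "first_name",
                   "grade", "county_code", "school_code")

def check_record(record):
    """Return a list of error messages for one pre-identification record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not record.get(field, "").strip():
            errors.append(f"missing {field}")
    if record.get("grade") not in VALID_GRADES:
        errors.append(f"invalid grade {record.get('grade')!r}")
    if not record.get("county_code", "").isdigit():
        errors.append("county_code must be numeric")
    return errors

def check_file(records):
    """Return {row_number: errors} for every record that fails a check."""
    return {i: errs for i, rec in enumerate(records, start=1)
            if (errs := check_record(rec))}
```

A county would run the checker and correct every reported row before transmitting the file to the vendor.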


Vendors will provide quality control throughout the printing process to ensure the quality of label printing. Pre-identified labels will be packaged and labeled by school and shipped to counties as part of the shipment containing test books and answer books.
3.2.1.14.8.1 Optional Technology Systems

Vendors shall also propose solutions and other technologies to permit the pre-identified answer documents/labels to be printed on dates closer to the actual spring test administration dates. Proposals for other technologies to pre-identify answer books will be acceptable as long as the alternative provides at least the same level of timeliness, reliability, and accuracy as the labels. A pilot test of the vendor's proposed system will be conducted for the Spring 2008 field test administration.


3.2.1.15 Packaging Specifications

3.2.1.15.1 Pack and Distribute Materials

The vendor will prepare packaging specifications to be delivered to the Agency four months before the field test and each operational administration. The packaging specifications will be updated as required to meet Agency needs. The packaging specifications will include the vendor’s procedures for packing and distributing materials to counties and receiving the materials from the counties. The specifications will include a description of how the materials are packed, examples of packing and inventory lists for boxes to counties and schools, methods used for distributing and returning materials, and a description of the procedures used to inventory materials as they are returned.


3.2.1.15.2 Quantities

The Agency will provide the vendor with a list of the current names, addresses, email addresses, and phone and fax numbers of the County Test Coordinators. These persons will monitor all aspects of the assessment for their counties. Vendors must ship materials to County Test Coordinators at approximately 57 separate sites. The number of counties and special schools that serve as counties may change slightly during the life of the contract. The vendor should provide examples of packaging specifications for the committee’s review.


Table 7 provides the estimated number of students who will receive testing materials. Materials will be needed for approximately 460 public/private/parochial schools in West Virginia's 55 counties and two special districts. A list of the primary materials to be shipped for each administration, the quantities to be packaged for schools, counties, and the Agency, and other packaging specifications are given in Section 3.2.1.7, Table 6.
Table 7: Number of Public School/Private Parochial Students Based on 2006-2007 Enrollment Figures for Grades K-2 with a Built-in 5% Overage



Grade    Number of Students
K        22,800
1        23,000
2        21,800


3.2.1.15.3 List of County Test Coordinators

The vendor will be provided a list of County Test Coordinators at the beginning of the contract and will be responsible for maintaining and updating this list as notified by the Agency. At the beginning of the contract, the Agency will provide the vendor with a data file containing a list of the counties and schools (names and identification numbers), and the numbers of students by grade tested during the last year.


3.2.1.15.4 Packing, Distributing, and Receiving Materials Provisions

The vendor’s proposal and final specifications for packing, distributing, and receiving materials will address the following provisions:



  • The Agency will provide to the vendor an electronic school population update file indicating the number of students anticipated to be tested in each school. The vendor and the Agency will decide on the method used to determine the final number of students per school. This number will be the basis for determining the quantities of materials to be shipped for each school and for the county. The vendor will generate packing lists based on these numbers.

  • The order of schools within a county on all lists and for shipping purposes will be by county/school WVEIS identification number.

  • The vendor will ensure all materials are shrink-wrapped in quantities specified by the Agency for shipping. No box will weigh more than 30 pounds.

  • The vendor will label the boxes of test books with the message “TO BE OPENED ONLY BY COUNTY TEST COORDINATOR” and mark all boxes with special colors, or labels, so they can be easily identified as secure materials. School boxes will be labeled with the number of the county and school and the name of the school. Labels must be approved by the Agency. Only boxes directed to the county-level staff, such as boxes containing county overage, will be labeled with the name of the County Test Coordinator. The vendor will label boxes on the top and number boxes as “Box 1 of X,” “Box 2 of X”, etc. where X is the total number of boxes sent to that county. The box containing the packing information will be clearly identified for both school and county materials.

  • The vendor will pay charges on all materials shipped to and from the County Test Coordinators. The vendor will make arrangements for and pay for shipment if, due to a delivery error, the county is asked to ship materials to another county. The vendor must use an overnight delivery service for such shipments. Vendors will identify carriers they propose to use and the procedures for delivery and return of testing materials. The Agency must approve all carriers.

  • Agency approval must be received before shipping printed products. Approval will be provided after the vendor and the Agency have received from the printer and have proofread examples of the printed products.

  • When applicable, materials will be packaged by school and sent to the County Test Coordinator in returnable boxes. The county will be responsible for distributing materials to the schools. The vendor is not responsible for any costs that schools may incur in shipping test materials from their schools to the county office, unless the need to ship is the result of a packaging error by the vendor.

  • The Agency and counties will decide how materials overage will be distributed to the counties and the schools for each shipment. Overage remaining with the vendor must be shipped by the vendor by one-day service, if necessary.

  • The vendor will provide an online system for County Test Coordinators to order additional materials. Also, the vendor will staff toll-free phone and fax lines during the period in which materials are shipped, additional materials are ordered, and materials are picked up from schools and counties. The service will utilize individuals, as needed, who are designated to respond only to West Virginia testing calls and to perform other West Virginia work during this period. The vendor's telephone lines must be staffed during Eastern Time Zone working hours.

  • It may be necessary for the vendor to ship the testing materials to the counties and the Agency in up to four separate shipments per testing window – not including short shipments. Braille and large print materials are to be packaged and labeled separately but included in the shipment of test materials to counties.

  • The vendor will prepay charges on return shipments from the counties. Return labels, prepaid postage labels, freight bills-of-lading, or electronic methods and instructions will be provided for the materials being returned.

  • The vendor will be responsible for mailing, or shipping by overnight delivery service, or other means, as appropriate, any miscellaneous materials to the Agency and counties as situations arise. The vendor must secure the services of shippers who will provide inside delivery and unload large shipments onto loading docks.

  • The vendor must develop procedures to monitor the receipt of all materials and develop error logs. The date materials are received, who signed for the delivery and any errors made by counties in packaging and completing forms must be documented. When problems arise, the vendor will be responsible for contacting the counties and the Agency concerning the problem and the resolution of the problem. The error logs will identify by school and county all failures to follow the established procedures, and, if appropriate, how the errors were resolved. The error logs will be delivered to the Agency immediately after materials from all counties have been received.

  • The vendor will provide a written report to the Agency documenting the check-in of all secure materials within 30 calendar days of initial receipt of the materials. If the vendor's system for barcode verification is demonstrated to be unreliable, each document will be required to be scanned twice at the vendor's expense, creating independent data files that can be matched for scanning errors. Beyond these measures, the vendor will be required to manually check answer documents to ensure 100% accuracy of check-in for secure documents. Upon Agency request, the vendor will work to resolve discrepancies in the numbers of returned answer documents.

  • The vendor will provide to the state, county and individual schools a checklist, or a similar procedure, for tracking secure answer documents.

  • The vendor will include in their check-in procedures a method of checking for and retrieving used answer documents erroneously packaged with unused materials or invalidated materials.

  • These “orphan” answer documents will be processed and scored under extended timelines. The vendor will work with the Agency to establish a timeline for processing “orphans”.
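The double-scan cross-check required above (two independent scanning passes matched for errors) could be sketched as follows; the bar code values and data shapes are illustrative assumptions:

```python
# Hypothetical double-scan comparison: each secure document's bar code is
# scanned twice into independent files, and any code whose occurrence count
# differs between the two passes is flagged for manual review.

from collections import Counter

def compare_scan_passes(pass_a, pass_b):
    """pass_a, pass_b: lists of bar codes from two independent scan runs.
    Returns the set of bar codes whose counts disagree between the passes."""
    counts_a, counts_b = Counter(pass_a), Counter(pass_b)
    return {code for code in counts_a.keys() | counts_b.keys()
            if counts_a[code] != counts_b[code]}
```

Any flagged code would then be traced back to the physical document and hand-checked before the check-in report is finalized.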


3.2.1.15.5 Missing Materials Report and Inventory

The vendor will prepare a missing materials report for secure answer documents based on the scanning completed during materials check-in. Reports listing the number of missing answer documents, the identification of each document, and the names of schools or counties with these documents will be prepared and sent to the Agency. Any missing materials returned by counties will be recorded in the missing materials inventory maintained by the vendor. The missing materials reports must be delivered to the counties and the Agency 30 days after the check-in of secure materials has been completed. For each administration, check-in and verification of secure materials must be completed prior to the first shipment of results to counties. The vendor will deliver a final summary report of missing materials to the Agency.
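One hypothetical way such a report might be generated, comparing the secure document IDs shipped to each school against the IDs scanned at check-in (the data shapes are illustrative assumptions):

```python
# Hypothetical missing materials report: for each (county, school), secure
# document IDs that were shipped but never scanned in are listed.

def missing_materials(shipped, scanned):
    """shipped: {(county, school): set of secure document IDs sent}.
    scanned: set of document IDs scanned during check-in.
    Returns rows (county, school, missing_count, sorted missing IDs)."""
    report = []
    for (county, school), ids in sorted(shipped.items()):
        missing = ids - scanned
        if missing:
            report.append((county, school, len(missing), sorted(missing)))
    return report
```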



3.2.1.15.6 Store/Retrieve Paper Answer Documents and Test Books

The vendor must ship to the Agency quantities of stored answer documents and test books as may be required. In some cases, retrieval of answer documents will require the vendor to conduct a manual verification of machine scoring. The vendor will be responsible for costs associated with retrieval and possible delivery of these materials to the Agency. In addition, when errors are found, the vendor may be required to re-score and re-report these documents.


3.2.1.15.7 Disposition of Paper Materials

Upon verification of the individual test booklet identification numbers of all test answer sheets returned by the counties and acceptance by the Agency of accurate results files, the vendor will inventory and store unused paper test books, manuals, and other materials for a period of twelve months. After acceptance by the Agency of accurate computer files, used answer documents must be stored for the life of the program.


Unused test books and answer documents may be destroyed after twelve months with written approval from the Agency. However, the vendor will store 100 copies of each subject/grade test and answer book for each administration throughout the life of the contract. Any materials that may be used in subsequent assessments will be stored by the vendor.
At the end of the program, the vendor will ship, or destroy, the answer documents according to instructions from the Agency. This destruction will be initiated by a letter from the vendor to the Agency requesting permission to destroy specific materials. Test security requirements will be maintained throughout the destruction process. Destruction of secure documents must be requested in writing and authorized by the Agency. Further, the vendor must submit a certification of destruction that describes in writing the specific items destroyed. If it is necessary to retain answer documents for longer time periods than previously outlined, the Agency will use additional funds to pay for the storage, or request the documents be transferred to the Agency for storage.
3.2.1.16 Processing and Scanning Verification Introduction

The vendor will design and implement the systems required to process and scan the results of student responses from each administration. The vendor will also develop procedures to verify the accuracy of data produced at each processing step. The vendor will work jointly with the Agency on finalizing the processing rules and data analyses shortly after award. The vendor should deliver proposed requirements based on discussions and past performance history to the Agency for approval.


Test processing will include receipt of the consumable test booklets, scanning multiple-choice test materials and ensuring the accuracy of data at each processing step. Additional data processing activities may include working with the state staff to edit accountability information and make corrections and additions to the data files as necessary.
3.2.1.16.1 Processing Specifications

The vendor must complete the development of all data processing, scanning, scoring, and reporting procedures prior to each test administration to ensure all procedures have been checked before the processing of student answer books begins. The vendor must monitor all aspects of the scanning and scoring procedures throughout the entire time actual answer books are being scanned and scored.


The vendor is responsible for developing processing and scanning verification specifications for each administration that describe in detail all the steps to be implemented to demonstrate to the Agency that the final reports of results are accurate. The vendor is responsible for revising the specifications and receiving Agency approval at least four months prior to each test administration. The components of the processing and scanning verification plan are as follows:

  • Verifying total quantities returned by schools and counties

  • Ensuring all pages are correctly ordered in consumable test booklets

  • Monitoring intensity levels read by each scanner

  • Monitoring reading of answers in the consumable test booklets, student bar codes, and other codes identifying the consumable test booklet

  • Developing guidelines for hand edits

  • Training staff to perform hand edits

  • Monitoring hand edits


3.2.1.16.2 Verify Document Receipt

The data verification plan will begin with inventorying the used consumable test booklets received. The vendor must compare the number of test booklets returned to the number on the header (test booklet count form) and compare the number returned to the number ordered by the school or county.


To assist in the process, County Test Coordinators will document the destruction of any test booklet within a formal letter sent to the vendor and to the Agency. The information will include the booklet number and any other important identifying information and the reason for destruction. Secure test materials may not be destroyed without written permission of the Agency.
The Agency prefers an electronic method for accomplishing this task that would generate a report showing the differences between the pre-identification file n-counts and the inventory of scanned returned test booklets by school within county. When a discrepancy is identified, the vendor will follow up and work with the county test coordinator and the Agency to resolve the discrepancy.
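The preferred electronic method could take a form like the following sketch; the count-file layout is an assumption for illustration:

```python
# Hypothetical receipt discrepancy report: pre-identification n-counts are
# compared against the inventory of scanned returned booklets, by school
# within county, and only the rows that differ are reported.

def receipt_discrepancies(preid_counts, returned_counts):
    """preid_counts / returned_counts: {(county, school): n}.
    Returns (county, school, expected, returned) rows where the numbers
    differ, sorted by county and then school."""
    keys = preid_counts.keys() | returned_counts.keys()
    return [(c, s, preid_counts.get((c, s), 0), returned_counts.get((c, s), 0))
            for c, s in sorted(keys)
            if preid_counts.get((c, s), 0) != returned_counts.get((c, s), 0)]
```

Each reported row would then be followed up with the County Test Coordinator until the discrepancy is resolved.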
3.2.1.16.3 Scan Materials

Accurate scanning must be verified on each scanner used through the use of periodic recalibration procedures. Scanning must be monitored by the vendor between each scan run and each time a scanner is recalibrated. The vendor will provide a plan to identify what types of monitoring the contractor will be performing and what types of data will be presented to the Agency to verify the scanners are working properly through each scan run of actual scoring.


3.2.1.16.4 Materials Edited

Vendors shall provide a description of the necessary editing of answer documents and headers that



  • contain double grids or inaccurate gridding of printed information,

  • are coded incorrectly with respect to student, school, or county identification,

  • are deemed partially or wholly unscorable for some reason, or

  • require other necessary edits.



3.2.1.16.5 Disaster Recovery Plan


The vendor shall provide a description of the plan to back up all systems, applications, and databases routinely to onsite and offsite locations. Additionally, the vendor shall detail the plan for data recovery in the event a disaster is declared where the data is maintained and stored. Database transaction logs should be archived and maintained online for 48 hours.
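A minimal sketch of the 48-hour retention rule, assuming transaction logs are tracked by last-modified time (the file names and data shapes are illustrative):

```python
# Hypothetical 48-hour retention rule: logs modified within the window stay
# online; older logs are moved to the archive.

from datetime import datetime, timedelta

RETENTION = timedelta(hours=48)

def partition_logs(logs, now):
    """logs: {filename: last_modified datetime}.
    Returns (online, to_archive) lists of filenames, sorted by name."""
    online, to_archive = [], []
    for name, modified in sorted(logs.items()):
        (online if now - modified <= RETENTION else to_archive).append(name)
    return online, to_archive
```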

3.2.1.17 Scoring and Technology Introduction


The proposal should set forth and document the capabilities of the company to score West Virginia materials within the prescribed time limits; the multiple-choice tests must be scored within three weeks. The references in the proposal should be able to substantiate these capabilities. The Agency desires to implement scoring processes that are reliable and valid, as well as efficient in terms of time and expenditures.
Vendors must design and implement systems required to process, score, and report the results of student responses from each administration. Vendors must provide evidence of their ability to assign reliable and valid scores for the methods proposed and also provide a detailed description of how the security of the prompts and student responses will be maintained throughout scoring. Scoring accuracy is a key component in maintaining the quality and integrity of the program while meeting challenging scoring deadlines.
The specifics for scoring the WESTEST in grades K-2 Reading/Language Arts, Mathematics, Science, and Social Studies include:

  • technical services related to the production and interpretation of results

  • detailed plan to integrate WVEIS student information with the scoring system and exporting student scores and item response files back into the WVEIS files

  • use of WVEIS unique student identifier

  • detailed plan to create output files including assessment data items as required by NCLB and Individuals with Disabilities Education Improvement Act of 2004 (IDEA)

  • log-in process for assessment materials

  • conversions and statistical measure calculations process

  • schedule to customize (if needed) scoring software

  • platform on which the programs run


3.2.1.17.1 Develop Specifications

For each administration, the vendor shall provide the Agency with a detailed scoring and reporting plan that documents the accuracy of all scoring and report production programs. This plan should detail the process of quality reviews of data and the printed report output during report processing and live report production. The vendor will work jointly with the Agency on finalizing the scoring rules and data analysis shortly after award.


Scoring verification steps should include:

  • Developing procedures for the vendor and the Agency to independently verify the calibrating, scaling, and equating

  • Developing procedures for the vendor and the Agency to independently verify the subscores

  • Verifying all items are scored correctly

  • Verifying all aggregated scores are correctly rounded and reported

  • Arranging for actual consumable test booklets to be processed, scanned, scored, and all student, school, county, and state reports generated and proofed by the vendor and the Agency

  • Developing procedures and reports to identify duplicate student records within and across districts.

  • Detailed plan to create output files

  • Detailed plan to develop and implement a log-in/log-out process for assessment materials.

  • Schedule to customize (if needed) the scoring software


3.2.1.17.2 Verify Scoring

Necessary software will be created for the Agency and the vendor to independently verify the calibrating, scoring, and equating specified in Section 3.2.1.12.5. The vendor will provide all of the resources, including software, staff support, and data files to permit parallel calibration, scoring, and equating of data during the same time period they are producing operational scoring tables. Psychometric staff must be available for daily discussions and consultation throughout the parallel calibration periods.


The scoring keys will be entered into the content/form management system and will include the item identification number; item type; item location; and correct answers for all item formats. The vendor and the Agency will independently verify all answer keys used in the scoring of answer books. The vendor's procedures must provide for at least two people to separately verify each answer key, and such verification shall be based on individuals actually reading and answering each test question. The Braille forms of the test may require separate scoring keys and score scales. This verification will take place in an electronic environment.
The vendor must check the accuracy and consistency of all student-level data on files before submitting the file to the Agency. This includes details such as ensuring all codes used on the final file are valid; all item scores for all item formats are scored accurately based on the students' answers shown on the file; all raw scores are aggregated correctly; and all student demographic information is coded correctly. The vendor must provide a quality-check system for verifying scoring, including trial runs before final reports. The vendor should provide evidence of its quality-check systems and processes.
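The two-person key verification described above could be supported by an item-by-item comparison like the following sketch; the key layout is an illustrative assumption:

```python
# Hypothetical key comparison: two independently entered answer keys are
# matched item by item, and every disagreement (including items present in
# only one key) is listed for resolution before operational scoring.

def key_discrepancies(key_a, key_b):
    """key_a, key_b: {item_id: correct_answer}, entered independently.
    Returns {item_id: (answer_a, answer_b)} wherever the entries differ."""
    return {item: (key_a.get(item), key_b.get(item))
            for item in key_a.keys() | key_b.keys()
            if key_a.get(item) != key_b.get(item)}
```

An empty result would indicate the two entries agree; any non-empty result would be resolved against the source documents by both verifiers.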
3.2.1.17.3 Cheating Detection

Vendors must propose additional analyses to help the Agency ensure the scores reported for each student are a valid representation of that student’s abilities. Methods should be proposed to flag student and/or school results that appear anomalous based upon comparisons to previous test results, other test results, or other student, or school results.


Examples of analyses include, but are not limited to, erasure analyses and cheating detection programs. Vendors' proposals should provide a complete description of the proposed analysis and include an example of reports generated as a result of this analysis. Based on the reports, the Agency may require the vendor to void student test scores.
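One hypothetical form an erasure analysis might take flags schools whose mean wrong-to-right erasure count is anomalously high relative to the state; the threshold and data shapes are illustrative assumptions, not Agency requirements:

```python
# Hypothetical erasure analysis: a school is flagged when its mean
# wrong-to-right (WR) erasure count exceeds the state mean by more than
# z_cutoff state standard deviations.

from statistics import mean, pstdev

def flag_schools(wr_by_school, z_cutoff=3.0):
    """wr_by_school: {school: [WR erasure counts, one per student]}.
    Returns the sorted list of flagged schools."""
    all_counts = [c for counts in wr_by_school.values() for c in counts]
    mu, sigma = mean(all_counts), pstdev(all_counts)
    if sigma == 0:
        return []
    return sorted(s for s, counts in wr_by_school.items()
                  if (mean(counts) - mu) / sigma > z_cutoff)
```

Flagged schools would appear on the anomaly report for Agency review, not be treated as proof of misconduct.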
3.2.1.17.4 Report Verification

An independent verification of all files and each report generated by the vendor will be conducted by both the vendor and the Agency for a maximum of three counties. The vendor should allow for up to eight staff members from the Agency to check these at either the vendor’s office, or at a mutually agreeable site.


The final electronic files and a copy of each report must be delivered to the Agency for data checking at least seven days prior to the mailing of the student reports to counties. The Agency will have no fewer than five working days to approve any individual file. If errors are identified on the files, additional time may be required for Agency review. Proposals may include any additional strategies that the vendor would recommend for consideration. The vendor must be prepared to regenerate files when errors are identified. The Agency will provide approval of all files before reports are printed and shipped.
3.2.1.17.5 Scoring Sites

Scoring for WESTEST K-2 must be conducted at the vendor’s established scoring sites. The vendor’s proposal will identify the number and physical locations of proposed scoring sites and which subjects and grades the vendor intends to score at each site. The Agency reserves the right to approve scoring sites and the distribution of subject/grade level scoring across sites.


At the Agency’s option, observers may be allowed access to the scoring centers for brief periods of time for the purpose of generally understanding the process. An Agency official, or vendor staff designated by the Agency, will accompany such visitors.
3.2.1.17.6 Expedited Performance Scoring

As stated, the Agency requires that the multiple-choice tests be scored within three weeks and reported/posted on the FTP site within two weeks. Vendors are expected to propose alternatives that would allow the statewide assessments to be processed on a timeline shorter than the one specified. The Agency must approve any alternatives the vendor proposes.


3.2.1.17.7 Overall Scoring Quality Control

The vendor shall provide quality control systems to verify the accuracy of the scoring, processing, and reporting of all test scores. In addition, the vendor will provide the results of these quality control reviews to the Agency so that the Agency can ensure any identified problems have been rectified. The Agency may also operate separate quality control operations. In so doing, the Agency may utilize the services of one or more vendors to assist in verification of the quality and accuracy of the statewide assessment results. These vendors will work under the direction of the Agency and will perform data verification checks at times and places so designated. The vendor will be obligated to provide data, information, explanations, and work space, if necessary, for data verification vendors working with the Agency.


The objective of the quality control processes is to triangulate analyses and to verify the data being reported are correct. The Agency will review all quality control findings and will provide permission for the vendor to prepare and distribute test results.
3.2.1.17.8 FTP Site

Vendors must include provisions for a secure File Transfer Protocol (FTP) site for data transfer from the vendor to the Agency and from the Agency to the vendor. This site must be maintained by the vendor to ensure its continued availability throughout the life of the program.



3.2.1.18 Reporting

The Agency desires easy-to-understand reports that are creative, attractive and technically defensible. Vendors should present innovative report designs that take advantage of current technologies for color printing and data merging. Reports should look similar to the WESTEST reports (See http://westest.k12.wv.us/reports.htm). The reports must provide numeric, verbal, and graphic presentations of assessment results that effectively communicate with intended audiences, including students, teachers, parents, and the general public.


At a minimum, the vendor must supply the reports listed in the Report Descriptions (See Section 3.2.1.18.1). The proposal should document the capabilities of the company to fulfill West Virginia’s reporting requirements within the prescribed time limits. Vendor procedures and report guidelines are found in this section. The vendor’s references (See Section 3.2.4.14) must be able to substantiate these capabilities.
The vendor shall provide for committee review a detailed scoring and reporting plan that documents the accuracy of all scoring and report production programs. This plan must detail the process of quality reviews of data and the printed report output during report processing and live online report production.
Minor adjustments to the reports should be anticipated by the successful vendor. In responding to the reporting requirements, vendors are encouraged to suggest combinations of report formats or innovative graphic or numeric displays.
3.2.1.18.1 Report Descriptions

The vendor's proposed timeline for completing the proposed services must respect the Spring 2008 field test and 2009 operational test administration dates. All student reports are to be separated before being shipped to counties or to the Agency.


In addition, the vendor will be prepared to process missing or erroneous reports throughout the duration of the contract. Copies of data files for each test administration shall be maintained throughout the duration of the contract. Three distribution levels will be specified: school, county, and state. Student reports will be shipped to the County Test Coordinator at each county office.
The WESTEST Grades K-2 reports for the program must at a minimum include:
1. STUDENT REPORT:

The individual Student Report will at a minimum include the following:



    • Student name, grade, date of birth, WVEIS #, class, school, county, state, and explanatory information about the scores

    • Total summative score and performance level

      • Writing performance, based on the writing prompt, for all students of the same grade level must be reliably combined into each student’s Reading/Language Arts score

    • Definition of terms

    • Performance level descriptors

      • One side of the student report will capture the performance level descriptors

    • Lexile/Quantile scores


2. INDIVIDUAL ITEM ANALYSIS (Per Student):

The Individual Item Analysis shall at a minimum provide for each student



  • Listing of all content standards and objectives for all items on the test

    • All objectives must be clustered under appropriate standards

  • Indication of whether the response to each multiple-choice item is correct or incorrect

  • Indication of the number of points possible and the points received by the student for each constructed-response item

  • Thinking skills levels by item/definition of thinking skills

  • Definitions for all unfamiliar terms on the report


3. ITEM ANALYSIS SUMMARY BY SUBGROUP REPORT (School, County and State)

The Item Analysis Summary by subgroup report shall be organized by grade by school, by grade by county, and by county by state and shall include at a minimum the following:



  • Results disaggregated by All, low Socio-economic Status, Special Education, Black, White, Hispanic, Asian/Pacific, Native American/Alaskan, Limited English Proficient, for grades K – 11

  • Content area, item number, CSO number and description

    • All objectives must be clustered by standards for each content area

  • Percent of students with item correct, percent of students with each score point, number of points possible, thinking skills levels by item

  • Definition of thinking skill levels

  • Definition of all unfamiliar terms on the report


4. CONFIDENTIAL ROSTER REPORT (ALPHA Level by Grade for School Level Only):

The Confidential Roster Report shall be organized by grade by school and by grade by county and shall include the following:



  • Student names in alphabetical order by grade (last name, first name, middle initial), grade, date of birth, WVEIS number, school, county, test date, prompt type

  • Student scale scores for all content areas and performance levels, Lexile and Quantile measures

  • All content areas should have assessment scores and performance levels

    • Show combined Reading/Language Arts/Writing Assessment scores and separate Writing score and performance level


5. CONFIDENTIAL SUMMARY REPORTS (School, County and State):

The West Virginia Confidential Summary Report shall be prepared for all schools, all counties, and the state. The reports shall contain a graph of the percent of students who attained each performance level category. The report shall also show the distribution of analytic trait scores for the group. The report shall be organized by grade by school, by grade by county, and by grade by state and shall at a minimum include the following:



  • Grade and test date

  • Number of students tested by content areas and subgroups (all, gender, race/ethnicity, students with disabilities, Limited English Proficient students, migrant, economically disadvantaged)

  • Performance levels by aggregate number and percent of students at each performance level

  • Mean scale scores and grade level mastery of the content area

  • Number of students tested by content standard, grade level mastery of content standards, and mean percent correct by content standard

  • Definition of all unfamiliar terms on the report

  • Writing performance levels for all students of the same grade level must be reliably combined into the Reading/Language Arts score

  • Number of students tested by content standard, grade level mastery of content standards, and mean percent correct

  • Definition of mean percent correct

Once all corrections have been agreed upon by the vendor and the Agency, the vendor will supply:

1) all corrected reports to the Agency via an FTP site within eight weeks of receipt and scoring of assessments and

2) the aggregate performance, as per the guidelines of this Confidential Summary Report, of all grade levels within each school, each county, and the state.


6. GENERAL RESEARCH FILE:

A report shall be programmed and made available to provide electronic data for preparing accountability reports. This file shall be organized by school, grade, county, state, and subgroup, and shall agree with the data reported on summary lines in the school, county, and state level reports.
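The kind of aggregation such a file implies can be sketched as below. This is an illustrative assumption only: the record fields, school names, and scores are invented, and the actual file layout would be defined in the vendor's Agency-approved specifications.

```python
# Illustrative sketch: rolling student-level records up into
# school/grade summary rows of the kind a general research file might
# contain. All field names and values here are assumptions.
from collections import defaultdict

def summarize(students):
    """Group student records by (school, grade) and compute the count
    tested and the mean scale score for each group."""
    groups = defaultdict(list)
    for s in students:
        groups[(s["school"], s["grade"])].append(s["scale_score"])
    return {
        key: {
            "n_tested": len(scores),
            "mean_scale_score": sum(scores) / len(scores),
        }
        for key, scores in sorted(groups.items())
    }

students = [
    {"school": "Elm ES", "grade": 1, "scale_score": 420},
    {"school": "Elm ES", "grade": 1, "scale_score": 440},
    {"school": "Oak ES", "grade": 2, "scale_score": 455},
]
print(summarize(students))
```

The key property the RFP requires is that every summary statistic in this file agree exactly with the summary lines printed on the school, county, and state reports.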


7. STUDENT LABEL

    • The Student Label will at a minimum include the following:

      • Student name, grade, school, test date, gender, date of birth, WVEIS number, content area, scale score, performance level, prompt type by grade by school

      • Self-adhesive to allow attachment to the student record


8. ELECTRONIC DATABASE PROGRAM OF STUDENT PERFORMANCE

    • Individual Student Performance with a re-rostering option for teachers

      • Reports by subgroup to include Title I, Special Education, and all other federally required subgroups

      • Capability of producing longitudinal data reports

      • Program should be customized to address the reports required in this proposal

      • Product may be software, or Web-based

      • Product should be compatible with software programs

This section requires the vendor to provide costs for placing all reports except the student report on a secure FTP site for local school districts for electronic retrieval and printing. All individual student reports are to be printed by the vendor and distributed to County Test Coordinators for dissemination to local schools. The vendor must supply two copies of each individual student report to the local school district.


Note: Any proprietary software required (along with all software support) to read the data must be included for the Agency and updated throughout the contract. Vendors are to fully describe this software. If additional copies will be required at the county level, pricing for this must be included for the life of the contract.
3.2.1.18.2 Develop Specifications

The vendor is responsible for developing specifications for each administration that describe in detail all the steps to be implemented to demonstrate to the Agency that the final reports of results are accurate. The vendor is expected to incorporate the procedural, design, and implementation requirements for reporting tasks into written specifications initially developed by September 2007 for the Spring 2008 field test administrations. The vendor should produce final specifications and mockups of proposed report forms for each subsequent administration within a similar time frame.


The vendor is responsible for drafting specifications for each report that include:

  • a description of the report

  • how the data on each report are generated (i.e., which population of students)

  • in which shipment the report is included

  • who receives the report with the number of copies received

  • a sample of the report

This plan should detail the process of electronic quality reviews for the data and the printed report output during report processing and live report production. The vendor will work jointly with Agency on finalizing the scoring rules and final reporting considerations shortly after award.


3.2.1.18.3 Report Development

The units of analysis for inclusion in West Virginia reports are the student, class, school, county, and state. At the school, county, and state levels, reports will include subgroup results (economically disadvantaged students, students with disabilities, Limited English Proficient students, major racial and ethnic groups, and gender). Special education reports will also be provided at the school, county, and state levels.


Test results will be reported by scale score and achievement level. Subtest results will be reported by standard and subtests when appropriate (e.g., usage/mechanics, rhetorical skills, pre-algebra and geometry). Also, test results by scale score and achievement level for the previously mentioned subgroups will be reported. The tests will be placed on a vertical scale, but percentile scores will not be reported.
The Lexile Framework® for Reading is a scientific approach to measuring reading ability and reading materials. A Lexile measure represents both the difficulty of a text, such as a book or article, and an individual’s reading ability. The Lexile scale is a developmental scale for measuring reader ability and text difficulty ranging from below 200L for beginning readers and beginning-reader materials to above 1700L for advanced readers and materials. Knowing the Lexile measure of a reader and the Lexile measure of a text helps to predict how the text matches the reader’s ability—whether the text may be too easy, too difficult or appropriate. All Lexile products, tools and services rely on the Lexile measure and the Lexile scale to match reader and text.
The Lexile measure should be linked to the reading or reading comprehension scale score. For instructional purposes, the Lexile measure should be reported at the student and classroom levels. Lexile measures should appear on reports that are sent home to parents and reports that are provided to the current grade-level teacher and/or the next grade-level teacher.
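The "linking" the paragraph above describes can be pictured as a simple mapping from the assessment's reading scale score to a Lexile-style measure. The sketch below is purely hypothetical: the actual linking function would come from a linking study (e.g., one conducted with MetaMetrics), and the slope and intercept here are invented for illustration.

```python
# Purely hypothetical linking function. A real scale-score-to-Lexile
# link would be derived from a formal linking study; the slope and
# intercept below are invented illustration values.

def scale_score_to_lexile(scale_score, slope=2.0, intercept=-400):
    """Map a reading scale score to a Lexile-style measure, floored at
    'BR' (Beginning Reader) for measures that fall below 0L."""
    measure = slope * scale_score + intercept
    return "BR" if measure < 0 else f"{int(measure)}L"

print(scale_score_to_lexile(450))  # → 500L
print(scale_score_to_lexile(150))  # → BR
```

Whatever its exact form, the linked measure is what would appear on the parent and teacher reports described above.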
The Quantile Framework® for Mathematics is a scientific approach to measuring mathematical achievement and concept/application solvability. A Quantile measure represents the difficulty of a mathematical skill, concept or application (called a QTaxon) and a developing mathematician’s mastery of the QTaxons in the areas of geometry, measurement, numbers and operations, algebra, and data analysis and probability. The Quantile Framework spans the developmental continuum from kindergarten Mathematics through the content typically taught in Algebra II, Geometry, Trigonometry and Pre-calculus, from below 0Q (Emerging Mathematician) to above 1400Q. Quantile measures take the guesswork out of determining which mathematical skills a developing mathematician has mastered and which ones require further instruction.
The Quantile measure should be linked to the Mathematics, Mathematics application, or problem solving scale score. For instructional purposes, the Quantile measure should be reported at the student and classroom levels. Quantile measures should appear on reports that are sent home to parents and reports that are provided to the current grade-level teacher and/or the next grade-level teacher.
The vendor and the Agency will extensively review all data files before they are used to produce live reports. The vendor must produce a live data file with a sample population composed of three counties selected by the Agency. This file will be used to check student-level and aggregated data for each content-area test at each grade. Each phase of reports will be created from this live data check file; both the file and reports must be sent to the Agency for verification and approval.
The Agency will review the data and draft reports, and will work with the vendor to resolve any questions. The Agency expects the vendor to conduct an extensive quality check before the file and final reports are sent to the Agency.
3.2.1.18.4 Update Report Designs

The vendor is responsible for annually reviewing and updating the design of the individual student, school, county, and state reports of test results in consultation with the Agency. Though report formats are not expected to change extensively from year to year, the vendor must, after each administration, pursue reporting requirements from the Agency and make any changes required by the Agency until final approval is given. No additional cost shall be charged to the Agency.


3.2.1.18.5 Report Delivery

During each administration, numerous reports and data files are provided for students, parents, schools, counties, the state, and the general public with data aggregated in various ways. The actual reports and data files to be generated are described in Section 3.2.1.18.1. The vendor must prepare the data files using formats approved by the Agency.


In addition to reports of results, there are also additional reports, including missing secure materials reports. Requirements are established for many reports to be available as electronic files in formats compliant with Section 508 of the Rehabilitation Act (refer to http://www.section508.gov/) and to allow the files to be viewed either on a website or as a downloaded file.
3.2.1.18.5.1 Report Phases/Timelines

The vendor will not provide individual student data, or reports, for the field test administration.


Following vendor quality checks, reports for the spring test administration should be delivered to the Agency and counties via secure FTP within five weeks after the vendor receives the tests. All student reports should be shipped to local counties via hardcopy within five weeks after the vendor receives the tests.
All student reports for schools are to be packaged by school name, but sent to the County Test Coordinators. All printed products will be proofed by the vendor and copies will be sent to the Agency for proofing and approval prior to mailing any product to the counties. The Agency must review each report before shipment or release of any reports. All printed student reports must be original laser copies.
The Agency reserves the right to request some records be removed from processing until specific issues are resolved. These issues include duplicate records, records with blank WVEIS student identification numbers and/or blank names, schools or students whose test records are under investigation for possible cheating, or other issues that might affect school totals. The issues regarding the suppressed records will be dealt with as soon as possible after reporting is completed.
The vendor may be requested to change the score reported flag on the file to one that would not report the student’s score, pull test documents to resolve duplicate tester issues, add a corrected Student Identification Number, or corrected name to a record, produce Individual Student Reports (as directed), and/or School Lists of Students. The vendor will work with the Agency to establish a timeline for the processing and reporting of these records.
3.2.1.18.5.2 Electronic Files

The vendor will supply the Agency, along with electronic reports, an electronic file, in a format approved by the Agency, containing individual student data aggregated by grade and subject for each school, county, and the state. These electronic records will agree with the data reported on summary lines in the school, county, and state level reports. Additional summary statistics for each school, county, and the state will be reported by disaggregated characteristics such as racial/ethnic group, gender, and other demographic information. Every summary statistic represented in the reports should be represented in this file.


The vendor will be responsible for checking to ensure all files are consistent and accurately reflect the data provided on the reports. The Agency will independently verify the consistency and accuracy of the data files.
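The independent verification described above amounts to recomputing each summary line from the student-level file and comparing it to the figure printed on the report. The sketch below illustrates one such check; the record layout and tolerance are assumptions for demonstration, not values taken from this RFP.

```python
# Hypothetical sketch: independently recompute a summary statistic from
# the student-level file and compare it against the value printed on a
# report. The tolerance and data layout are illustrative assumptions.

def verify_summary_line(student_scores, reported_mean, tolerance=0.05):
    """Return True when the recomputed mean scale score agrees with the
    reported mean within the given tolerance."""
    recomputed = sum(student_scores) / len(student_scores)
    return abs(recomputed - reported_mean) <= tolerance

# Student-level scores for one school/grade and the mean the report shows
scores = [430, 445, 415, 450]
print(verify_summary_line(scores, reported_mean=435.0))  # → True
```

A check of this kind would be run for every summary line in every file, with any discrepancy triggering regeneration of the affected file before Agency approval.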
3.2.1.19 Optional Reporting Services

The Agency requests that each vendor provide costs/quotes for any optional services, enhancements or projected updates in the proposal. Please provide information about product efficiency, usability, expanded reporting capabilities and costing. These optional services must be available upon request by the Agency.




