West Virginia Department of Education

Request for Proposal

(RFP)

Statewide Assessment Program

Office of Assessment and Accountability

Division of Curriculum and Instructional Services

Agency: Division of School Improvement, Office of Assessment and Accountability for RFP # EDD265467
PART 1 GENERAL INFORMATION/TERMS AND CONDITIONS
1.1 Purpose 1

1.2 Project 1

1.3 RFP Format 1

1.4 Inquiries 1

1.5 Vendor Registration 2

1.6 Oral Statements and Commitments 2

1.7 Economy of Preparation 2

1.8 Labeling of RFP Sections 2

1.8.1 Mandatory Requirements 2

1.8.2 Contract Terms and Conditions 2

1.8.3 Informational Sections 2

1.9 Proposal Format and Submission 2

1.9.1 2

1.9.2 3


1.9.3 3

1.9.4 Best Value Purchasing Standard Format 3

1.9.4.1 Evaluation Criteria 3

1.9.4.2 Proposal Format and Content 3

1.9.4.3 Technical Bid Opening 3

1.9.4.4 Technical Evaluation 3

1.9.4.5 Cost Bid Opening 4

1.9.4.6 Cost Evaluation and Resident Vendor Preference 4

1.9.4.7 Contract Approval and Award 4

1.10 Rejection of Proposals 4

1.11 Incurring Costs 4

1.12 Addenda 4

1.13 Independent Price Determination 5

1.14 Price Quotations 5

1.15 Public Record 5

1.15.1 Submissions are Public Record 5

1.15.2 Written Release of Information 5

1.15.3 Risk of Disclosure 5

1.16 Schedule of Events 5

1.17 Mandatory Prebid Conference 6

1.18 Affidavit 6

1.19 General Terms and Conditions 6

1.19.1 Conflict of Interest 6

1.19.2 Prohibition Against Gratuities 6

1.19.3 Certifications Related to Lobbying 6

1.19.4 Vendor Relationship 7

1.19.5 Indemnification 7

1.19.6 Contract Provisions 7

1.19.7 Governing Law 8

1.19.8 Compliance with Laws and Regulations 8

1.19.9 Subcontracts/Joint Ventures 8

1.19.10 Term of Contract & Renewals 9

1.19.11 Non-Appropriation of Funds 9

1.19.12 Contract Termination 9

1.19.13 Changes 9

1.19.14 Invoices, Progress Payments, & Retainage 10

1.19.15 Liquidated Damages 11

1.19.15.1 Project Schedule Liquidated Damages 11

1.19.15.2 Project Scoring Errors 11

1.19.15.3 Loss/Destruction of Materials 11

1.19.16 Record Retention (Access & Confidentiality) 11
PART 2 OPERATING ENVIRONMENT

2.1 Location 13

2.2 Background 13

Table 1 Projected Summative Assessment for 2008-2014 13

2.2.1 Current Technology Structure/Design 14

PART 3 PROCUREMENT SPECIFICATIONS

3.1 General Requirements 16

3.1.1 Current Technology Structure/Design 17

Table 2 Requirements for Student Computers 17

3.1.2 Assessment Alignment (Using the Dr. Norman Webb Model) 18

3.1.3 Grades K-12 19

3.2.1 Grades K-2 Introduction 23

3.2.1.1 Test Construction Specifics 23

3.2.1.2 Blueprint/Item Specifications 24

3.2.1.3 Item Development/Selection Process 24

3.2.1.4 Universal Design Principles 25

3.2.1.5 Reviews 26

3.2.1.5.1 Content Review Meetings 26

3.2.1.5.2 Bias Reviews 26

3.2.1.5.3 Face-to-Face Item Selection Final Review 26

3.2.1.6 Form Development Process 27

3.2.1.6.1 Field Test Development/Operational 27

Table 3 Grade K-2 Field Test Design – 2007-2008 Multiple Choice Item Counts 27

Table 4 Grade K-2 Operational Test Design – 2009-2014 Multiple Choice Item Counts 27

3.2.1.6.2 Field Test Form Development 29

Table 5 WESTEST Grades K-2 West Virginia Materials Production Specifications for Field Test of NRT and Augmented Items 29

3.2.1.7 Ancillary Product Development 30

Table 6 WESTEST Grades K-2 West Virginia Materials Production Specifications for Operational Test of NRT and Augmented Items 30

3.2.1.8 Art and Production 32

3.2.1.9 Accommodations 32

3.2.1.9.1 Braille 32

3.2.1.9.2 Large Print 33

3.2.1.10 Content/Form Management System 33

3.2.1.10.1 Introduction 33

3.2.1.10.2 Descriptive Information for Content/Form Management System 34

3.2.1.10.3 Psychometric Information for Content/Form Management System 34

3.2.1.10.4 Software/Hardware Concerns for Content/Form Management System 35

3.2.1.10.5 Optional Content/Form Management System Tasks 35

3.2.1.10.6 Copyright Issues 35

3.2.1.10.7 Examples of Items 35

3.2.1.10.8 Examples of Page Layouts 36

3.2.1.11 Item Alignment to the 21st Century WV CSOs 36

3.2.1.12 Psychometric Research and Technical Services 36

3.2.1.12.1 Introduction 36

3.2.1.12.2 Descriptive Statistics 37

3.2.1.12.3 Validity 37

3.2.1.12.4 Reliability 38

3.2.1.12.5 Scale Scores - Calibration Scaling and Equating Procedures 38

3.2.1.12.5.1 Statistical Software 39

3.2.1.12.6 Vertical Scaling 39

3.2.1.12.7 Standard Setting 39

3.2.1.12.8 Statistical Analyses for Special Populations and Other Studies 41

3.2.1.12.9 External Quality Control 41

3.2.1.12.10 Psychometric Support 41

3.2.1.13 Technical Reporting 41

3.2.1.13.1 Analyses Reports 41

3.2.1.13.2 Final Technical Reports/Documents 41

3.2.1.14 Materials Production 43

3.2.1.14.1 Test Booklets 43

3.2.1.14.1.1 Custom Covers 44

3.2.1.14.2 Consumable Test Booklets 44

3.2.1.14.3 Test Examiner’s Manual 44

3.2.1.14.4 County Test Coordinator’s Manual 45

3.2.1.14.5 Other Ancillary Documents 45

3.2.1.14.6 Braille and Large Print Documents 45

3.2.1.14.7 Materials Distribution/Retrieval 46

3.2.1.14.8 Pre-identification of Answer Documents 46

3.2.1.14.8.1 Optional Technology Systems 47

3.2.1.15 Packaging Specifications 47

3.2.1.15.1 Pack and Distribute Materials 47

3.2.1.15.2 Quantities 47

Table 7 Number of Public School/Private Parochial Students Based on 2006-2007 Enrollment Figures for Grades K-2 with a Built-in 5% Overage 47

3.2.1.15.3 List of County Test Coordinators 47

3.2.1.15.4 Packing, Distributing, and Receiving Materials Provisions 48

3.2.1.15.5 Missing Materials Report and Inventory 50

3.2.1.15.6 Store/Retrieve Paper Answer Documents and Test Books 50

3.2.1.15.7 Disposition of Paper Material 50

3.2.1.16 Processing and Scanning Verification Introduction 50

3.2.1.16.1 Processing Specifications 51

3.2.1.16.2 Verify Document Receipt 51

3.2.1.16.3 Scan Materials 51

3.2.1.16.4 Materials Edited 52

3.2.1.16.5 Disaster Recovery Plan 52

3.2.1.17 Scoring and Technology Introduction 52

3.2.1.17.1 Develop Specifications 53

3.2.1.17.2 Verify Scoring 53

3.2.1.17.3 Cheating Detection 54

3.2.1.17.4 Report Verification 54

3.2.1.17.5 Scoring Sites 54

3.2.1.17.6 Expedite Performance Scoring 54

3.2.1.17.7 Overall Scoring Quality Control 54

3.2.1.17.8 FTP Site 55

3.2.1.18 Reporting 55

3.2.1.18.1 Reports Descriptions 55

3.2.1.18.2 Develop Specifications 58

3.2.1.18.3 Report Development 58

3.2.1.18.4 Update Report Designs 60

3.2.1.18.5 Report Delivery 60

3.2.1.18.5.1 Report Phases/Timelines 60

3.2.1.18.5.2 Electronic Files 61

3.2.1.19 Optional Reporting Services 61

3.2.1.20 Disaster Recovery Plan 61

3.2.1.21 Data Management 61

3.2.1.22 Disposal/Final Delivery/Destruction of Materials 62

3.2.2.1 GRADES 3-11 Introduction 64

3.2.2.1.1 Test Construction Specifics 64

3.2.2.1.2 Blueprint/Item Specifications 65

3.2.2.1.3 Item Specifications 65

Table 8 Grades 3-11 FIELD TEST DESIGN – 2007-2008 Option 1 with Customized Constructed Response Items 66

Table 9 Grades 3-11 OPERATIONAL DESIGN 2009-2014 Option 1 with Customized Constructed Response Items 66

Table 10 Grades 3-11 Field Test Design - 2007-2008 Option 2 without Customized Constructed Response Items 67

Table 11 Grades 3-11 Operational Test Design - 2009-2014 Option 2 without Customized Constructed Response Items 67

3.2.2.1.4 Item Development/Selection Process 67

3.2.2.1.4.1 Universal Design Principles 68

3.2.2.1.5 Reviews 69

3.2.2.1.5.1 Content Review Meetings 69

3.2.2.1.5.2 Bias Reviews 69

3.2.2.1.5.3 Face-to-Face Item Selection Final Review 70

3.2.2.1.6 Form Development Process 70

3.2.2.1.6.1 Field Test Development 70

3.2.2.1.6.2 Operational Form Development 70

3.2.2.1.6.3 Ancillary Product Development 71

Table 12 WESTEST Grades 3-11 West Virginia Materials Production Specifications for the 2008 Field Test of NRT and Augmented Items Assessment 71

Table 13 WESTEST Grades 3-11 West Virginia Materials Production Specifications for the 2009 Operational Test of Supplemental Materials 73

3.2.2.1.6.4 Art and Production 75

3.2.2.1.7 Accommodations 75

3.2.2.1.7.1 Braille 75

3.2.2.1.7.2 Large Print 76

3.2.2.1.8 Online Pilot Testing Option 76

3.2.2.1.9 Content/Form Management System 76

3.2.2.1.9.1 Introduction 76

3.2.2.1.9.2 Descriptive Information for Content/Form Management System 77

3.2.2.1.9.3 Psychometric Information for Content/Form Management System 78

3.2.2.1.9.4 Software/Hardware Concerns for Content/Form Management System 78

3.2.2.1.9.5 Optional Content/Form Management System Tasks 78

3.2.2.1.10 Copyright Issues 78

3.2.2.1.11 Examples of Items 79

3.2.2.1.12 Examples of Page Layouts 79

3.2.2.1.13 Item Alignment to the 21st Century WV CSOs 79

3.2.2.1.14 Psychometric Research and Technical Services 79

3.2.2.1.14.1 Introduction 79

3.2.2.1.14.2 Descriptive Statistics 80

3.2.2.1.14.3 Validity 81

3.2.2.1.14.4 Reliability 81

3.2.2.1.14.5 Scale Scores - Calibration Scaling and Equating Procedures 82

3.2.2.1.14.6 Statistical Software 83

3.2.2.1.14.7 Vertical Scaling 83

3.2.2.1.14.8 Standard Setting 83

3.2.2.1.15 Statistical Analyses for Special Populations and Other Studies 83

3.2.2.1.16 External Quality Control 83

3.2.2.1.17 Psychometric Support 83

3.2.2.1.18 Technical Reporting 86

3.2.2.1.18.1 Analyses Reports 86

3.2.2.1.18.2 Final Technical Reports/Documents 86

3.2.2.1.19 Materials Production 87

3.2.2.1.19.1 Test Booklets 87

3.2.2.1.19.1.1 Custom Covers 88

3.2.2.1.19.2 Answer Documents 88

3.2.2.1.19.3 Test Examiner’s Manual 89

3.2.2.1.19.4 County Test Coordinator’s Manual 89

3.2.2.1.19.5 Other Ancillary Documents 89

3.2.2.1.19.6 Braille and Large Print Documents 90

3.2.2.1.19.7 Electronic and Paper Based Breach Forms 90

3.2.2.1.19.8 Materials Distribution/Retrieval 90

3.2.2.1.19.9 Pre-identification of Answer Documents 91

3.2.2.1.19.10 Optional Technology Systems 91

3.2.2.1.19.11 Pack and Distribute Materials 91

3.2.2.1.19.11.1 Packaging Specifications 91

3.2.2.1.19.11.2 Quantities 92

Table 14 Number of Public School/Private School Students Based on 2006-2007 Enrollment Figures for Grades 3-11 With a Built-in 5% Overage 92

3.2.2.1.19.12 List of County Test Coordinators 92

3.2.2.1.19.13 Packing, Distributing, and Receiving Materials Provisions 92

3.2.2.1.19.13.1 Missing Materials Report and Inventory 94

3.2.2.1.19.14 Store/Retrieve Paper Answer Documents and Test Books 94

3.2.2.1.19.15 Disposition of Paper Materials 95

3.2.2.1.20 Processing and Scanning Verification Introduction 95

3.2.2.1.20.1 Processing Specifications 95

3.2.2.1.20.2 Verify Document Receipt 96

3.2.2.1.20.3 Scan Materials 96

3.2.2.1.20.4 Materials Edited 96

3.2.2.1.20.5 Disaster Recovery Plan 97

3.2.2.1.21 Scoring and Technology Introduction 97

3.2.2.1.21.1 Develop Specifications 98

3.2.2.1.21.2 Verify Scoring 98

3.2.2.1.21.3 Cheating Detection 99

3.2.2.1.21.4 Report Verification 99

3.2.2.1.21.5 Scoring Sites 99

3.2.2.1.21.5.1 Expedite Performance Scoring 100

3.2.2.1.21.5.2 Overall Scoring Quality Control 100

3.2.2.1.21.5.3 Handscoring Constructed Response Tasks and Essays Introduction (Option #2) 100

3.2.2.1.21.5.4 Produce Handscoring Specifications 101

3.2.2.1.21.5.5 Conduct Performance Scoring Operations (Option #2) 101

3.2.2.1.21.5.6 Conduct Rangefinder Review Meetings 101

3.2.2.1.21.5.7 Conduct Rangefinder Selection Meetings 102

3.2.2.1.21.5.8 Produce Scoring Materials 102

3.2.2.1.21.5.9 Handscoring Reports 102

3.2.2.1.21.5.10 Scoring Student Responses 103

3.2.2.1.21.5.11 Monitor and Maintain Handscoring Quality 103

3.2.2.1.21.5.12 Handscoring Personnel 104

3.2.2.1.21.5.13 Scoring Directors for Handscoring 104

3.2.2.1.21.5.14 Team Scoring Leaders 104

3.2.2.1.21.5.15 Recruit and Hire Readers 104

3.2.2.1.21.5.16 Training and Qualifying of Readers 105

3.2.2.1.22 Online Assessment Technology 105

3.2.2.1.22.1 Technology Requirements 107

3.2.2.1.22.2 Security 107

3.2.2.1.22.3 Reporting 108

3.2.2.1.22.4 Disaster Recovery Plan 108

3.2.2.1.22.5 FTP Site 108

3.2.2.1.23 Reporting 108

3.2.2.1.23.1 Reports Descriptions 109

3.2.2.1.23.2 Develop Specifications 111

3.2.2.1.23.3 Report Development 112

3.2.2.1.23.4 Update Report Designs 113

3.2.2.1.24 Report Delivery 113

3.2.2.1.24.1 Report Timelines 113

3.2.2.1.24.2 Electronic Records 114

3.2.2.1.24.3 Optional Reporting Services 114

3.2.2.1.25 Data Management 114

3.2.2.1.26 Disposal/Final Delivery/Destruction of Materials 115

3.2.2.1.27 Transition Plan 115

3.2.2.1.27.1 Transition Activities 116

3.2.2.1.27.2 Transitional Meetings 117

Table 15 Transition Meetings 117

3.2.2.1.27.3 Transition Psychometric Issues 117

3.2.2.1.27.4 Ownership 117

3.2.2.2 Alternate Performance Task Assessment for Grades 3-8 and 11 Introduction 119

3.2.2.2.1 Item/Form Development Introduction 120

3.2.2.2.1.1 Test Construction/Design Specifics 120

3.2.2.2.1.2 Blueprint/Item Specifications 121

Table 16 Grades 3- 8 and 11 Field Test Design – 2007-2008 121

Table 17 Grades 3-8 and 11 Operational Test Design – 2008-2009 122

3.2.2.2.1.3 Item Specifications 122

3.2.2.2.1.4 Item Development/Selection Process 122

3.2.2.2.2 Universal Design Principles 123

3.2.2.2.3 Reviews 123

3.2.2.2.3.1 Content Review Meetings 124

3.2.2.2.3.2 Bias Reviews 124

3.2.2.2.3.3 Face-to-Face Review 124

3.2.2.2.4 Form Development Process 124

3.2.2.2.4.1 Field Test Development 124

3.2.2.2.4.2 Operational Form Development 125

3.2.2.2.5 Ancillary Product Development 125

Table 18 APTA Grades 3-11 West Virginia Materials Production Specifications 2008 Field Test of APTA Items 126

Table 19 APTA Grades 3-11 West Virginia Materials Production Specifications 2008 Operational Test 127

3.2.2.2.6 Art and Production 128

3.2.2.2.7 Accommodations 128

3.2.2.2.7.1 Braille 128

3.2.2.2.8 Content/Form Management System 129

3.2.2.2.8.1 Introduction 129

3.2.2.2.8.2 Descriptive Information for Content/Form Management System 130

3.2.2.2.8.3 Psychometric Information for Content/Form Management System 130

3.2.2.2.8.4 Software/Hardware Concerns 131

3.2.2.2.8.5 Optional Content/Form Management System Tasks 131

3.2.2.2.9 Copyright Issues 131

3.2.2.2.10 Examples of Items 131

3.2.2.2.11 Examples of Page Layouts 131

3.2.2.2.12 Item Alignment to the Alternate Academic Achievement Standards for West Virginia Schools 131

3.2.2.2.13 Psychometric Research and Technical Services Introduction 132

3.2.2.2.13.1 Descriptive Statistics 132

3.2.2.2.13.2 Validity 133

3.2.2.2.13.3 Reliability 133

3.2.2.2.13.4 Scale Scores - Calibration Scaling and Equating Procedures 134

3.2.2.2.13.4.1 Statistical Software 134

3.2.2.2.14 Vertical Scaling 134

3.2.2.2.15 Standard Setting 135

3.2.2.2.16 Statistical Analyses for Special Populations and Other Studies 135

3.2.2.2.17 External Quality Control 135

3.2.2.2.18 Psychometric Support 136

3.2.2.2.19 Technical Reporting 136

3.2.2.2.19.1 Analyses Reports 136

3.2.2.2.19.2 Final Technical Reports/Documents 136

3.2.2.2.20 Materials Production 137

3.2.2.2.20.1 Test Booklets 138

3.2.2.2.20.2 Custom Covers 139

3.2.2.2.20.3 Answer Documents 139

3.2.2.2.20.4 Test Examiner’s Manual 139

3.2.2.2.20.5 County Test Coordinator’s Manual 139

3.2.2.2.20.6 Other Ancillary Documents 139

3.2.2.2.20.7 Braille 140

3.2.2.2.20.8 Breach Forms 140

3.2.2.2.20.9 Materials Distribution/Retrieval 140

3.2.2.2.20.10 Scoring Materials: 140

3.2.2.2.21 Pre-identification of Scannable Consumable Test Booklets 141

3.2.2.2.22 Optional Technology Systems 141

3.2.2.2.23 Pack and Distribute Materials 141

3.2.2.2.23.1 Packaging Specifications 141

3.2.2.2.23.2 Quantities 142

Table 20 FIELD TEST DESIGN – 2007-2008 142

3.2.2.2.23.3 List of County Test Coordinators 142

3.2.2.2.23.4 Packing, Distributing, and Receiving Materials Provisions 142

3.2.2.2.23.5 Missing Materials Report and Inventory 144

3.2.2.2.23.6 Store/Retrieve Paper Scannable Consumable Test Booklets 144

3.2.2.2.23.7 Disposition of Paper Material 144

3.2.2.2.24 Processing and Scanning Booklets Verification Introduction 145

3.2.2.2.24.1 Processing Specifications 145

3.2.2.2.24.2 Verify Document Receipt 146

3.2.2.2.24.3 Scan Materials 146

3.2.2.2.24.4 Materials Edited 146

3.2.2.2.24.5 Disaster Recovery Plan 146

3.2.2.2.25 Scoring and Technology Introduction 146

3.2.2.2.26 Develop Specifications 148

3.2.2.2.26.1 Verify Scoring 148

3.2.2.2.26.2 Report Verification 148

3.2.2.2.26.3 Handscoring for APTA 149

3.2.2.2.26.4 Produce Handscoring Specifications 150

3.2.2.2.26.5 Conduct Performance Scoring Operations 150

3.2.2.2.26.6 Produce Scoring Materials 150

3.2.2.2.26.7 Handscoring Reports 150

3.2.2.2.26.8 Scoring Student Responses 150

3.2.2.2.26.9 Monitor and Maintain Hand Scoring Quality 150

3.2.2.2.26.10 Handscoring Personnel 151

3.2.2.2.26.11 Scoring/Reporting Project Director for Handscoring 151

3.2.2.2.26.12 Scoring Leaders 152

3.2.2.2.26.13 Recruit and Hire Readers 152

3.2.2.2.26.14 Training and Qualifying of Readers 152

3.2.2.2.26.15 Scoring Site 152

3.2.2.2.26.16 Expedite Performance Scoring 152

3.2.2.2.26.17 Overall Scoring Quality Control 153

3.2.2.2.27 Reports Descriptions 153

3.2.2.2.28 Sample Reports 156

3.2.2.2.29 Report Development 156

3.2.2.2.29.1 Update Report Designs 156

3.2.2.2.29.2 Report Delivery 156

3.2.2.2.29.3 Report Phases/Timelines 156

3.2.2.2.30 Electronic Records 157

3.2.2.2.31 Reports Descriptions/Timelines 157

3.2.2.2.32 Optional Reporting Services 157

3.2.2.2.33 Disaster Recovery Plan 158

3.2.2.2.34 Data Management 158

3.2.2.2.35 Transition Plan 158

Table 21 Transition Meeting Plan 159

3.2.2.2.35.1 Transition Psychometric Issues 159

3.2.2.3 Modified Assessment Grades 3-8 and 11 Introduction 160

3.2.2.3.1 Item/Form Development Introduction 161

3.2.2.3.1.1 Test Construction/Design Specifics 161

3.2.2.3.1.2 Blueprint/Item Specifications 162

Table 22 Grades 3-8 and 11 Field Test Design – 2007-2008 162

Table 23 Grades 3-8 and 11 Operational Test Design – 2008-2009 163

3.2.2.3.1.3 Item Specifications 163

3.2.2.3.1.4 Item Development/Selection Process 163

3.2.2.3.2 Universal Design Principles 164

3.2.2.3.3 Reviews 164

3.2.2.3.3.1 Content Review Meetings 165

3.2.2.3.3.2 Bias Reviews 165

3.2.2.3.3.3 Face-to-Face Review 165

3.2.2.3.4 Form Development Process 165

3.2.2.3.4.1 Field Test Development 165

3.2.2.3.4.2 Operational Form Development 166

3.2.2.3.5 Ancillary Product Development 166

Table 24 Modified Assessment Grades 3-11 West Virginia Materials Production Specifications 2008 Field Test of Modified Items 167

Table 25 Modified Assessment Grades 3-11 West Virginia Materials Production Specifications 2008 Operational Test 168

3.2.2.3.6 Art and Production 169

3.2.2.3.7 Accommodations 169

3.2.2.3.7.1 Braille and Large Print 169

3.2.2.3.8 Content/Form Management System 170

3.2.2.3.8.1 Introduction 170

3.2.2.3.8.2 Descriptive Information for Content/Form Management System 171

3.2.2.3.8.3 Psychometric Information for Content/Form Management System 171

3.2.2.3.8.4 Software/Hardware Concerns for Content/Form Management System 172

3.2.2.3.8.5 Optional Content/Form Management System Tasks 172

3.2.2.3.9 Copyright Issues 172

3.2.2.3.10 Examples of Items 172

3.2.2.3.11 Examples of Page Layouts 172

3.2.2.3.12 Item Alignment to the Modified Academic Achievement Standards for West Virginia Schools 173

3.2.2.3.13 Psychometric Research and Technical Services Introduction 173

3.2.2.3.13.1 Descriptive Statistics 173

3.2.2.3.13.2 Validity 174

3.2.2.3.13.3 Reliability 174

3.2.2.3.13.4 Scale Scores - Calibration Scaling and Equating Procedures 175

3.2.2.3.13.4.1 Statistical Software 175

3.2.2.3.14 Vertical Scaling 175

3.2.2.3.15 Standard Setting 175

3.2.2.3.16 Statistical Analyses for Special Populations and Other Studies 176

3.2.2.3.17 External Quality Control 177

3.2.2.3.18 Psychometric Support 177

3.2.2.3.19 Technical Reporting 177

3.2.2.3.19.1 Analyses Reports 177

3.2.2.3.19.2 Final Technical Reports/Documents 177

3.2.2.3.20 Materials Production 179

3.2.2.3.20.1 Scannable Consumable Test Booklets 179

3.2.2.3.20.2 Custom Covers 180

3.2.2.3.20.3 Production of Scannable Consumable Test Booklets 180

3.2.2.3.20.4 Test Examiner’s Manual 180

3.2.2.3.20.5 County Test Coordinator’s Manual 181

3.2.2.3.20.6 Other Ancillary Documents 181

3.2.2.3.20.7 Braille and Large Print Test Documents 181

3.2.2.3.20.8 Breach Forms 182

3.2.2.3.20.9 Materials Distribution/Retrieval 182

3.2.2.3.20.10 Scoring Materials 182

3.2.2.3.21 Pre-identification of Scannable Consumable Test Booklets 183

3.2.2.3.22 Optional Technology Systems 183

3.2.2.3.23 Pack and Distribute Materials 183

3.2.2.3.23.1 Packaging Specifications 183

3.2.2.3.23.2 Quantities 183

Table 26 FIELD TEST DESIGN – 2007-2008 184

3.2.2.3.23.3 List of County Test Coordinators 184

3.2.2.3.23.4 Packing, Distributing, and Receiving Materials Provisions 184

3.2.2.3.23.5 Missing Materials Report and Inventory 186

3.2.2.3.23.6 Store/Retrieve Paper Scannable Consumable Test Booklets 186

3.2.2.3.23.7 Disposition of Paper Material 186

3.2.2.3.24 Processing and Scanning Booklets Verification Introduction 186

3.2.2.3.24.1 Processing Specifications 187

3.2.2.3.24.2 Verify Document Receipt 187

3.2.2.3.24.3 Scan Materials 188

3.2.2.3.24.4 Materials Edited 188

3.2.2.3.24.5 Disaster Recovery Plan 188

3.2.2.3.25 Scoring and Technology Introduction 188

3.2.2.3.26 Develop Specifications 189

3.2.2.3.26.1 Verify Scoring 189

3.2.2.3.26.2 Report Verification 190

3.2.2.3.26.3 Handscoring for Modified Assessment 190

3.2.2.3.26.4 Produce Handscoring Specifications 191

3.2.2.3.26.5 Conduct Performance Scoring Operations 191

3.2.2.3.26.6 Produce Scoring Materials 191

3.2.2.3.26.7 Handscoring Reports 191

3.2.2.3.26.8 Scoring Student Responses 192

3.2.2.3.26.9 Monitor and Maintain Hand Scoring Quality 192

3.2.2.3.26.10 Handscoring Personnel 193

3.2.2.3.26.11 Scoring/Reporting Project Director for Handscoring 193

3.2.2.3.26.12 Scoring Leaders 193

3.2.2.3.26.13 Recruit and Hire Readers 193

3.2.2.3.26.14 Training and Qualifying of Readers 194

3.2.2.3.26.15 Scoring Site 194

3.2.2.3.26.16 Expedite Performance Scoring 194

3.2.2.3.26.17 Overall Scoring Quality Control 194

3.2.2.3.27 Reports Descriptions 195

3.2.2.3.28 Sample Reports 197

3.2.2.3.29 Report Development 197

3.2.2.3.29.1 Update Report Designs 198

3.2.2.3.29.2 Report Delivery 198

3.2.2.3.29.3 Report Phases/Timelines 198

3.2.2.3.30 Electronic Records 199

3.2.2.3.31 Reports Descriptions/Timelines 199

3.2.2.3.32 Optional Reporting Services 199

3.2.2.3.33 Disaster Recovery Plan 199

3.2.2.3.34 Data Management 199



3.2.2.4 Writing Assessment 3-11 201

Table 27 Writing Assessment Options: (Vendors must bid on all options) 203

Table 28 2007 – 2008 Field Test Prompt/Passage Specifications 203

Table 29 Field Test Specifications -- Scoring of 2007-2008 Field Test Essays 205

3.2.2.4.1 Operational Tests 2008-2009 through 2013-2014 205

Table 30 Grades 3-11 – Operational Test Specifications 2008-2009 -- 2013-2014 205

3.2.2.4.1.1 Test Design 206

3.2.2.4.1.2 Field Test 207

Table 31 2007–2008 FIELD TEST ESSAYS 207

Table 32 SCORING OF 2007-2008 FIELD TEST ESSAYS 209

3.2.2.4.1.3 Operational Tests 2008-2009 through 2013-2014 209

Table 33 Grades 3-11 Operational Test – 2008-2009 Through 2013-2014 209

3.2.2.4.2 Prompt/Passage/Form Development 210

3.2.2.4.2.1 Test Construction Specifics 210

3.2.2.4.2.2 Test and Prompt/Passage Specifications 210

3.2.2.4.2.3 Prompt/Passage Development/Selection Process 211

3.2.2.4.2.4 Universal Design Principles 212

3.2.2.4.2.5 Reviews 212

3.2.2.4.2.6 Prompt/Passage Review Meetings 213

3.2.2.4.2.7 Bias Review 213

3.2.2.4.2.8 Face-to-Face Review 213

3.2.2.4.3 Form Development Process 213

3.2.2.4.3.1 Field Test Development 213

3.2.2.4.3.2 Operational Form Development 214

3.2.2.4.4 Ancillary Product Development 214

3.2.2.4.5 Art and Production 215

3.2.2.4.6 Accommodations 215

3.2.2.4.6.1 Braille 215

3.2.2.4.6.2 Large Print 216

3.2.2.4.7 Content/Form Management System 216

3.2.2.4.7.1 Introduction 216

3.2.2.4.7.2 Descriptive Information for Content/Form Management System 216

3.2.2.4.7.3 Psychometric Information for Content/Form Management System 217

3.2.2.4.7.4 Software/Hardware Concerns for Content/Form Management System 217

3.2.2.4.7.5 Optional Prompt/Passage Content Management System Tasks 218

3.2.2.4.8 Copyright Issues 218

3.2.2.4.9 Examples of Prompts/Passages 218

3.2.2.4.10 Examples of Page Layouts 218

3.2.2.4.11 Prompt/Passage Alignment to the 21st Century WV CSOs 219

3.2.2.4.12 Psychometric Research and Technical Services 219

3.2.2.4.12.1 Descriptive Statistics 219

3.2.2.4.12.2 Validity 220

3.2.2.4.12.3 Reliability 220

3.2.2.4.12.4 Scale Scores – Calibration Scaling and Equating Procedures 221

3.2.2.4.12.5 Statistical Software 221

3.2.2.4.12.6 Vertical Scaling 221

3.2.2.4.12.7 Standard Setting 222

3.2.2.4.12.8 Statistical Analyses for Special Populations and Other Studies 223

3.2.2.4.13 External Quality Control 224

3.2.2.4.14 Psychometric Support 224

3.2.2.4.15 Technical Reporting 224

3.2.2.4.15.1 Analyses Reports 224

3.2.2.4.15.2 Final Reports 225

3.2.2.4.16 Materials Production 226

3.2.2.4.16.1 Tests/Answer Documents 227

3.2.2.4.16.2 Test Examiner Manual 227

3.2.2.4.16.3 County Test Coordinator Manual 227

3.2.2.4.16.4 Other Ancillary Documents 227

3.2.2.4.16.5 Braille and Large Print Documents 228

3.2.2.4.16.6 Breach Forms 228

3.2.2.4.16.7 Responses to Writing Prompts 228

3.2.2.4.17 Materials Distribution/Retrieval 228

3.2.2.4.17.1 Scoring Materials 229

3.2.2.4.17.2 Pre-identification of Answer Documents 229

3.2.2.4.17.3 Optional Technology Systems 229

3.2.2.4.17.4 Pack and Distribute Materials 229

3.2.2.4.17.4.1 Packaging Specifications 230

3.2.2.4.17.4.2 Quantities 230

3.2.2.4.17.5 List of County Test Coordinators 230

3.2.2.4.17.6 Packing, Distributing and Receiving Materials Provisions 230

3.2.2.4.17.7 Missing Materials Report and Inventory 230

3.2.2.4.17.8 Store/Retrieve Paper Answer Documents 231

3.2.2.4.17.9 Disposition of Paper Materials 231

3.2.2.4.18 Processing and Scanning Verification 231

3.2.2.4.18.1 Processing Specifications 232

3.2.2.4.18.2 Verify Document Receipt 232

3.2.2.4.18.3 Scan Materials 232

3.2.2.4.18.4 Materials Edited 232

3.2.2.4.19 Scoring and Technology 232

3.2.2.4.19.1 Develop Specifications 234

3.2.2.4.19.2 Verify Scoring 234

3.2.2.4.19.3 Cheating Detection 234

3.2.2.4.19.4 Report Verification 235

3.2.2.4.20 Handscoring Essays 235

3.2.2.4.20.1 Produce Handscoring Specifications 235

3.2.2.4.20.2 Conduct Performance Scoring Operations 236

3.2.2.4.20.3 Conduct Rangefinder Review Meetings 236

3.2.2.4.20.4 Conduct Rangefinder Selection Meetings 236

3.2.2.4.20.5 Produce Scoring Materials 236

3.2.2.4.20.6 Handscoring Reports 237

3.2.2.4.20.7 Scoring Student Responses 237

3.2.2.4.20.8 Monitor and Maintain Handscoring Quality 238

3.2.2.4.20.9 Handscoring Personnel 238

3.2.2.4.20.10 Scoring Directors for Handscoring 239

3.2.2.4.20.11 Team Scoring Leaders 239

3.2.2.4.20.12 Recruit and Hire Readers 239

3.2.2.4.20.13 Training and Qualifying of Readers 239

3.2.2.4.20.14 Scoring Sites 240

3.2.2.4.20.15 Expedite Performance Scoring 240

3.2.2.4.20.16 Overall Scoring Quality Control 240

3.2.2.4.21 Online Writing/Technology Specifications 241

3.2.2.4.21.1 Security 242

3.2.2.4.21.2 Administration 243

3.2.2.4.22 FTP Site 243

3.2.2.4.23 Scoring/Reporting 243

3.2.2.4.24 Reporting 244

3.2.2.4.24.1 Reports Descriptions/Timelines 244

3.2.2.4.24.2 Develop Specifications 246

3.2.2.4.24.3 Report Development 246

3.2.2.4.24.4 Update Report Designs 247

3.2.2.4.24.5 Report Delivery 247

3.2.2.4.24.6 Report Phases 247

3.2.2.4.24.7 Electronic Records 248

3.2.2.4.24.8 Optional Reporting Services 248

3.2.2.4.24.9 Disaster Recovery Plan 248

3.2.2.4.25 Data Management 248

3.2.2.4.26 Final Delivery of Materials 249

3.2.2.4.27 Transition Plan 249

Table 34 Transition Plan 250

3.2.2.4.27.1 Transition Psychometric Issues 250

Table 35 West Virginia Materials Production Specifications Writing Assessment Field Test 2008 251

Table 36 West Virginia Materials Production Specifications Writing Assessment Operational Test 2008-2014 252



3.2.2.5 General Requirements for Algebra I Grades 7-12 Introduction 254

3.2.2.5.1 Item/Form Development 254

3.2.2.5.1.1 Test Construction Specifics 254

3.2.2.5.2 Blueprint/Item Specifications 255

Table 37 Test Design for Algebra I EOC Operational Forms 255

3.2.2.5.2.1 Item Specifications 256

3.2.2.5.3 Item Development/Selection Process 256

3.2.2.5.3.1 Universal Design Principles 257

3.2.2.5.4 Reviews 257

3.2.2.5.4.1 Content Review Meetings 257

3.2.2.5.4.2 Bias Review Meetings 258

3.2.2.5.4.3 Face-to-Face Review Meetings 258

3.2.2.5.5 Form Development Process 258

3.2.2.5.5.1 Field Test Development 258

3.2.2.5.5.2 Operational Form Development 259

3.2.2.5.6 Ancillary Product Development 259

3.2.2.5.6.1 Test Examiner’s Manual 260

3.2.2.5.6.2 County Test Coordinator’s Manual 260

3.2.2.5.7 Art and Production 260

3.2.2.5.8 Accommodations 261

3.2.2.5.8.1 Braille 261

3.2.2.5.8.2 Online Large Print Screens 261

3.2.2.5.9 Content/Form Management System 261

3.2.2.5.9.1 Introduction 261

3.2.2.5.9.2 Descriptive Information Content/Form Management System 262

3.2.2.5.9.3 Psychometric Information for Content/Form Management System 262

3.2.2.5.9.4 Software/Hardware Concerns for Content/Form Management System 263

3.2.2.5.9.5 Optional Content/Form Management System Tasks 263

3.2.2.5.9.6 Copyright Issues 264

3.2.2.5.9.7 Examples of Items 264

3.2.2.5.9.8 Examples of Screen Layouts 264

3.2.2.5.9.9 Validity 264

3.2.2.5.9.10 Technical Report Documentation 264

3.2.2.5.10 Online Assessment Technology 265

3.2.2.5.10.1 Technology Requirements 266

3.2.2.5.10.2 Security 267

3.2.2.5.10.3 Reporting 267

3.2.2.5.10.4 Disaster Recovery Plan 267

3.2.2.5.10.5 FTP Site 268

3.2.2.5.11 Reports Descriptions 268

Table 38 Algebra I Enrollment (Does Not Include a 5% Overage) 268

3.2.2.6 Online Formative Assessment Grades K—12 Introduction 271

3.2.2.6.1 Online Formative Assessment Overview 272

Table 39 West Virginia Public/Private/Parochial Enrollment with 5% Overage 273

3.2.2.6.2 Online Formative Assessment System 274

3.2.2.6.3 Formative Assessment Item Bank Requirements 275

3.2.2.6.3.1 Formative Assessment Item Bank Authoring 277

3.2.2.6.4 Administration 278

3.2.2.6.5 Online Assessment Technology 278

3.2.2.6.5.1 Technology Requirements 280

3.2.2.6.5.2 Security 280

3.2.2.6.5.3 Reporting 281

3.2.2.6.5.4 Disaster Recovery Plan 281

3.2.2.6.5.5 FTP Site 281

3.2.2.6.6 Software/Hardware Security Concerns 281

3.2.2.6.7 Accommodations 281

3.2.2.6.7.1 Large Print 282

3.2.2.6.8 Optional Content/Form Management System Tasks 282

3.2.2.6.9 Copyright Issues 282

3.2.2.6.10 Psychometric Information 282

3.2.2.6.10.1 Validity 282

3.2.2.6.10.2 Reliability 283

3.2.2.6.10.3 Performance Levels 283

3.2.2.6.11 Scoring/Reporting 283

3.2.2.6.12 References 286



3.2.3 College Predictive/Entrance Exams 288

3.2.3.1 Vendor Requirements 288

3.2.3.1.1 Test Specifics 288

3.2.3.1.2 Agency Participation 288

3.2.3.1.3 Vendor Item Development Process 288

3.2.3.1.3.1 Universal Design Principles 289

3.2.3.1.4 Bias Reviews 289

3.2.3.1.5 Test Development Process 289

3.2.3.1.5.1 Field Test Results 289

3.2.3.1.5.2 Operational Form Development 289

3.2.3.1.6 Ancillary Product Development 290

3.2.3.1.7 Art and Production 290

3.2.3.1.8 Accommodations 290

3.2.3.1.8.1 Braille 290

3.2.3.1.8.2 Large Print 291

3.2.3.1.9 Psychometric Information 291

3.2.3.1.10 Copyright Issues 291

3.2.3.1.11 Examples of Tests 291

3.2.3.1.12 Examples of Page Layouts 291

3.2.3.2 Item Alignment to the CSOs 291

3.2.3.3 Psychometric Research and Technical Services Introduction 292

3.2.3.4 Psychometric Support 292

3.2.3.5 Analyses Reports 292

3.2.3.6 Materials Production 292

3.2.3.6.1 Test Booklets 293

3.2.3.6.1.1 Test Booklet Covers 293

3.2.3.6.2 Answer Documents 293

3.2.3.6.3 Test Examiner’s Manual 293

3.2.3.6.4 Interpretative Guide 294

3.2.3.6.5 Other Ancillary Documents 294

3.2.3.6.6 Braille and Large Print Documents 294

3.2.3.6.7 Breach Form 294

3.2.3.7 Materials Distribution/Retrieval 294

3.2.3.8 Pre-identification of Answer Documents 295

3.2.3.9 Optional Technology Systems 295

3.2.3.9.1 FTP Site 296

3.2.3.10 Pack and Distribute Materials 296

3.2.3.10.1 Packaging Specifications 296

3.2.3.10.2 Quantities 296

Table 40 Number of Public, Private, and Parochial Students Based on 2006-2007 Enrollment Figures for Grades 8, 10, 11 and 12 296

3.2.3.10.3 List of County Test Coordinators 297

3.2.3.10.4 Packing, Distributing, and Receiving Materials Provisions 297

3.2.3.10.5 Store/Retrieve Paper Answer Documents and Test Books 298

3.2.3.10.6 Disposition of Paper Material 298

3.2.3.11 Processing and Scanning Verification Introduction 299

3.2.3.11.1 Processing Specifications 299

3.2.3.11.2 Scan Materials 300

3.2.3.11.3 Materials Edited 300

3.2.3.12 Disaster Recovery Plan 300

3.2.3.13 Scoring and Technology Introduction 300

3.2.3.13.1 Develop Specifications 301

3.2.3.14 Cheating Detection 301

3.2.3.15 Scoring Writing Essays 301

3.2.3.15.1 Produce Scoring Specifications 301

3.2.3.15.2 Conduct Performance Scoring Operations 301

3.2.3.15.3 Produce Scoring Materials 302

3.2.3.15.4 Scoring Reports 302

3.2.3.15.5 Monitor and Maintain Scoring Quality 302

3.2.3.16 Scoring Personnel 302

3.2.3.16.1 Scoring Directors for Image-based Scoring 302

3.2.3.16.2 Recruit and Hire Readers 302

3.2.3.16.3 Training and Qualifying of Readers 302

3.2.3.16.4 Scoring Sites 302

3.2.3.16.4.1 Expedite Performance Scoring 303

3.2.3.16.5 Overall Scoring Quality Control 303

3.2.3.17 Reporting 303

3.2.3.17.1 Specifications 303

3.2.3.17.2 Report Development 303

3.2.3.17.3 Update Report Designs 304

3.2.3.17.4 Report Delivery 304

3.2.3.17.4.1 Report Phases 305

3.2.3.17.4.2 Electronic Records 306

3.2.3.17.4.3 Report CDs 306

3.2.3.17.5 Reports/Timelines 306

3.2.3.17.6 Reports Descriptions 306

3.2.3.18 College Reports 307

3.2.3.19 Data Management 307

3.2.3.20 Final Delivery of Materials 308

3.2.3.21 Test Administrators Training 308

3.2.3.22 State and National Meetings 308

3.2.3.23 Make-ups and Re-Tests for College Entrance Exam 309

3.2.3.24 Other 309

3.2.4 Program Management 311

3.2.4.1 Vendor Provisions 311

3.2.4.2 Contact and Communication Between the Vendor and Agency 312

3.2.4.2.1 Management Meetings and Agency Staff Travel 312

3.2.4.3 Communication Tools – Hardware/Software 313

3.2.4.4 Project Management Documentation 313

3.2.4.5 Communication Between the Vendor and Counties 314

3.2.4.6 Project Management 314

3.2.4.6.1 Vendor Staff 314

3.2.4.6.2 Program Management 315

3.2.4.6.3 Program Manager/Assistant Program Manager 315

3.2.4.6.4 Program Management Team 315

3.2.4.6.5 Other Key Staff Members 316

3.2.4.6.6 Other Professional Personnel 316

3.2.4.6.7 Other Team Personnel 316

3.2.4.6.8 Staffing Replacement 317

3.2.4.6.9 Overall Organizational Staffing 317

3.2.4.6.10 Staff Resource Allocation 317

3.2.4.6.11 Agency Responsibilities 317

3.2.4.7 Author’s Alterations 317

3.2.4.8 Ownership 318

3.2.4.9 Project Schedule 318

3.2.4.10 Deadline Adjustments 318

3.2.4.10.1 Vendor-Proposed Adjustments 318

3.2.4.10.2 Agency-Proposed Adjustments 318

3.2.4.10.3 Waiver 318

3.2.4.10.4 Other Remedies 318

3.2.4.10.5 Agency Approval 319

3.2.4.11 Vendor/Subcontractor Corporate Qualifications and Experience 319

3.2.4.11.1 Entity Qualifications 319

3.2.4.11.2 Organizational Capacity to Perform Work 319

3.2.4.11.3 Vendor/Subcontractor Staffing 320

3.2.4.12 Vitae 320

3.2.4.13 Team Chart(s) 321

3.2.4.14 References 321

3.2.4.14.1 Letters of Reference 321

3.2.4.14.2 Client References 321

3.2.4.15 Other Current/Pending Contracts 322

3.2.4.16 Errors/Problems 322

3.2.4.17 Other Information and Appendices 322

3.2.4.17.1 Sample Materials 322

3.2.4.17.2 Additional Vendor Provided Information 322



3.3 Special Terms and Conditions 324

3.3.1 Bid and Performance Bonds 324

3.3.2 Insurance Requirements 324
