
Monographs in Engineering Education Excellence

University of South Carolina College of Engineering and Information Technology


Gateway Engineering Education Coalition

Edward Ernst, University of South Carolina, Monographs Editor



A Continuous Quality Improvement System: An On-going Assessment Process within the College of Engineering and Information Technology at U.S.C.

Susan D. Creighton

Edward W. Ernst

Joseph H. Gibbons

Charles W. Brice

Francis A. Gadala-Maria

Jed S. Lyons

Anthony Steve McAnally



University of South Carolina

Number 4, December 2000




Monographs in Engineering Education Excellence

Edward Ernst, University of South Carolina, Monographs Editor




A Continuous Quality Improvement System: An On-going Assessment Process within the College of Engineering and Information Technology at U.S.C.
By:

Susan D. Creighton

Edward W. Ernst

Joseph H. Gibbons

Charles W. Brice

Francis A. Gadala-Maria

Jed S. Lyons

Anthony Steve McAnally

Published by the College of Engineering and Information Technology, University of South Carolina, Columbia, SC 29208. Address editorial correspondence to Edward Ernst, 3A12 Swearingen Engineering Center, University of South Carolina, Columbia, SC 29208; (803) 777-9017; Ernst@engr.sc.edu.

Contents


Preface

Background
College Assessment Infrastructure
College-Wide System
Assessment Plan
Assessment Methods
Quality Review Process
Program Assessment Structures and Processes
  Mechanical Engineering Program
  Chemical Engineering Program
  Civil Engineering Program
  Electrical Engineering Program
  Computer Engineering Program

Appendices
  A Assessment Plan
  B Senior Survey
  C Senior Survey reports (sample)
  D Course Survey
  E Course Survey reports (sample)
  F Alumnae/Alumni Survey
  G Alumnae/Alumni Survey reports (sample)
  H Faculty/Staff Surveys
  I Faculty/Staff Survey reports (sample)
  J Entering Student Survey
  K Entering Student Survey reports (sample)
  L Performance Assessment Instrument
  M Mid-Course Evaluation
  N Education Outreach Survey
  O Professional Communication Center Assessment Report
  P Longitudinal Student Tracking Report (sample)
  Q Bates House Project Report
  R Template for Documenting Assessment Progress

Preface


Monographs in Engineering Education Excellence is a series of publications dealing with innovations in engineering education introduced at the University of South Carolina, with the support of the Gateway Engineering Education Coalition. The series seeks to make the information and ideas in the reports more accessible to engineering educators. It is hoped that other institutions will find the reports useful and adaptable to their own educational mission.

The Monographs in Engineering Education Excellence series includes a variety of genres (theses, dissertations, and technical reports), but all have the common objective of rethinking, reshaping, and revitalizing engineering education. This monograph, A Continuous Quality Improvement System: An On-going Assessment Process within the College of Engineering and Information Technology at U.S.C., discusses the college-wide assessment and CQI system developed to ensure that the educational programs of the college are achieving the expectations held for them. The monograph presents examples and details regarding the tools, policies, processes, and procedures that have been developed and implemented in the college. These assessment/CQI efforts have evolved with support from the Gateway Engineering Education Coalition.

Broad agreement exists within the engineering education community on the need for systemic educational reform, so that programs can provide the activities necessary to develop graduates who meet the new standards for the 21st century. The reform movement encourages more diversity in classroom practices, moving instruction from traditional lecture toward structured activities that reflect what engineers do in the workplace. These initiatives promote changes in classroom practices to reflect the knowledge, skills, and abilities engineers need to conceptualize, articulate, and implement solutions to engineering problems. The reform movement also advocates that engineering curricula incorporate a variety of teaching methods to involve students in active learning, design projects, technology use, and multidisciplinary teams. Outcomes-based assessments, in the form of design projects, portfolios, and model construction, enable faculty to link student competencies with the expectations of the workplace.

Believing in the need for change and recognizing that engineering is part of the growing national trend toward increased accountability, many accrediting organizations as well as national and state funding agencies, such as the National Science Foundation, have taken leadership roles in defining new parameters for engineering education. The paradigm shift is clearly evident in the new criteria adopted by the Accreditation Board for Engineering and Technology (ABET), which promote the use of outcomes assessment as the measuring tool for institutional and program evaluation. The stated goals of ABET accreditation include: (1) providing graduates of accredited programs who are adequately prepared to enter the engineering profession; (2) stimulating the improvement of engineering education; and (3) encouraging new and innovative approaches to engineering education.

To achieve these objectives, the ABET Engineering Criteria 2000 stipulate that individual programs must have published educational objectives consistent with the mission of their institution. Programs must evaluate the success of students in meeting program objectives using appropriate assessment methodologies. The ABET criteria also require engineering programs to include a continuous quality improvement process. In this model, the program evaluation process documents progress toward achievement of the objectives established by the engineering program and uses this information to improve the program.

Moreover, the criteria require that programs demonstrate student outcomes that include such complex skills as the ability to design and conduct experiments and to analyze and interpret data; the ability to design a system, component, or process to meet desired needs; and the ability to communicate effectively. Types of evidence advocated by ABET to document these student outcomes include portfolios, design projects, nationally normed subject content examinations, and alumnae/alumni and employer surveys.

Criterion 2 of the ABET Engineering Criteria 2000 mandates a system that continually evaluates the programs to determine whether program objectives are met and whether they meet the needs of the program’s constituencies. The College developed and implemented a college-wide infrastructure with supporting policies, procedures, personnel and assessment tools to ensure the permanency and effective operation of the system.

The college-wide assessment system is linked with the continuous quality improvement processes initiated within each USC engineering program - Chemical, Civil, Computer, Electrical and Mechanical engineering. Together, the college-wide assessment processes and the program assessment processes comprise the USC COEIT Continuous Quality Improvement System.

The college-wide infrastructure provides the coordination and collaboration efforts needed to facilitate: (1) continuous cycles of program improvement; (2) the attainment of college goals and objectives; and (3) the achievement of state-level and accreditation agency performance indicators. The structure supports the personnel and resources necessary to maintain the flow of data, information and evaluation results through the system. It also serves as the focus for the triangulation and synthesis of data from different constituencies and various reports.

Background

Numerous reports over the past ten years have outlined the attributes that engineering graduates need to possess in the 21st century workplace [1]. The engineering education culture is shifting from one emphasizing individual specialization, compartmentalization of knowledge and a research-based faculty reward structure to one that values integration and specialization, teamwork, educational research and innovation. Institutions of higher education now focus on student outcomes or performance-based models of instruction that strive to measure what students have learned and what they can do [2]. Outcomes assessment examines the results of the education process by asking to what extent students have accomplished the objectives of their discipline.

There is broad agreement on the need for systemic educational reform within the engineering community, so that programs can provide the activities necessary to develop graduates who meet the new standards for the next century. The reform movement encourages more diversity in classroom practices, moving instruction from traditional lecture toward structured activities that reflect what engineers do in the workplace. These initiatives promote changes in classroom practices to reflect the knowledge, skills, and abilities engineers need to conceptualize, articulate, and implement solutions to engineering problems. The reform movement also advocates that engineering curricula incorporate a variety of teaching methods to involve students in active learning, design projects, technology use, and multidisciplinary teams. Outcomes-based assessments, in the form of design projects, portfolios, and model construction, enable faculty to directly link student competencies with the expectations of the workplace.

Believing in the need for change and recognizing that engineering is part of the growing national trend toward increased accountability, many accrediting organizations as well as national and state funding agencies, such as the National Science Foundation, have taken leadership roles in defining new parameters for engineering education. The paradigm shift is clearly evident in the new criteria adopted by the Accreditation Board for Engineering and Technology (ABET), which promote the use of outcomes assessment as the measuring tool for institutional and program evaluation. The stated goals of ABET accreditation include: (1) providing graduates of accredited programs who are adequately prepared to enter the engineering profession; (2) stimulating the improvement of engineering education; and (3) encouraging new and innovative approaches to engineering education [3].

To achieve these objectives, the ABET Engineering Criteria 2000 stipulate that individual programs must have published educational objectives consistent with the mission of their institution. Programs must evaluate the success of students in meeting program objectives using appropriate assessment methodologies. The ABET criteria also require engineering programs to include a continuous quality improvement process. In this model, the program evaluation process provides documentation of progress toward achievement of the objectives established by the engineering program and uses this information to improve the program.

In addition, the criteria require that programs demonstrate student outcomes that include such complex skills as the ability to design and conduct experiments and to analyze and interpret data; the ability to design a system, component, or process to meet desired needs; and the ability to communicate effectively. Types of evidence advocated by ABET to document these student outcomes can include portfolios, design projects, nationally normed subject content examinations, focus groups, and surveys of alumnae/alumni, students and/or employers.



College Assessment Infrastructure

As engineering classroom practices change, the evaluation of student development and program effectiveness must align with the new ABET emphases. Criterion 2 of the Criteria 2000 specifies that programs must have published educational objectives that are consistent with the mission of the institution. It also mandates a system that continually evaluates the programs to determine whether program objectives are met and whether they meet the needs of the program’s constituencies. To this end, the University of South Carolina College of Engineering and Information Technology (COEIT) developed and implemented a college-wide infrastructure with supporting policies, procedures, personnel and assessment tools to ensure the permanency and effective operation of the system.



The college-wide assessment system is linked with the continuous quality improvement processes initiated within each USC engineering program - Chemical, Civil, Computer, Electrical and Mechanical engineering. Together, the college-wide assessment processes and the program assessment processes comprise the USC COEIT Continuous Quality Improvement System. Both parts of this system are integrated within the College Strategic Plan. As seen in Figure 1, this plan connects the College to its institution through the statement of the University of South Carolina’s vision, mission and goals.



Figure 1. Overview of COEIT Continuous Quality Improvement System

The purpose of the continuous quality assessment system is to assess continually the needs of the programs’ various constituencies, to ensure that the programs are achieving the expectations described by their objectives, and to evaluate how effectively each program and the College have moved toward their stated mission and goals. Assessment processes show faculty, staff, administrators and others where improvements appear to be appropriate and guide the implementation of change within each program and the college-wide service areas. Changes are monitored and re-evaluated to determine what improvement has been realized. Thus, the system is an ongoing evaluation of the effectiveness of the College and its programs.

The following sections will discuss both the College-wide system and the program systems. Examples and details will be given regarding the tools, policies, processes, and procedures that have been developed and implemented at USC COEIT to ensure the institutionalization of the CQI System.

Note. This monograph is a snapshot of the status of the system at the end of the Spring 2000 semester. The CQI processes are relatively new and continue to change.

College-wide System

The College-wide infrastructure provides the coordination and collaboration efforts needed to facilitate: (1) continuous cycles of program improvement; (2) the attainment of college goals and objectives; and (3) the achievement of state-level and accreditation agency performance indicators. The structure supports the personnel and resources necessary to maintain the flow of data, information and evaluation results through the system. It also serves as the focus for the triangulation and synthesis of data from different constituencies and various reports.



The comprehensive character of the college-wide assessment structure is evident in the following diagram.




Figure 2. College-Wide Assessment Infrastructure

The diagram shows the integration of state and institutional parameters within the system. It also highlights the linking of college assessment processes to its departmental programs. A more comprehensive view of the departmental assessment processes within this continuous loop system is discussed in a later section.

The personnel and processes of the college-wide assessment infrastructure, however, are the focus of this diagram. The college-wide infrastructure consists of several formal, key components: the College Executive Committee, the Center for Engineering Education Excellence, the Center for Engineering Education Excellence Team, the Assessment Director, the Departmental Assessment/Education Committees, and the College’s various constituencies.

A brief overview will outline the responsibilities of each component and provide insight into how these personnel and committees interact to produce a continuous quality improvement process.



Executive Committee

The Executive Committee is composed of the Dean, the Associate Deans, the Departmental Chairpersons and the Center for Engineering Education Excellence Director. This committee meets at two-week intervals and provides oversight and decision-making for the College.



Center for Engineering Education Excellence

The Center for Engineering Education Excellence is an interdisciplinary organization of individuals who collaborate in the effort to promote self-study, innovation and reform within the College. The staff and support personnel involved in the Center include the Director of the Center, a Program Coordinator, the Assessment Director, the Director of the Professional Communications Center and the Ethics Coordinator.

The mission of the Center includes all the major parts of engineering education - undergraduate, graduate, and research - and promotes meaningful integration of engineering education. The educational goal of the Center is to graduate students who understand the technology content of engineering as well as the social, political, ethical, environmental and economic context.

The objectives for the Center have both an internal and an external thrust. These objectives include:



  • Develop students as engineering professionals with the motivation, capability and knowledge base for career-long learning

  • Emphasize effective teaching/learning strategies for all types of students

  • Promote effective and (time) efficient student/faculty interaction

  • Enhance the continuous quality improvement process (CQI) within the College

  • Serve the engineering education community by encouraging innovation and reform

  • Increase the visibility of USC to the engineering education community

  • Provide a channel for learning about innovation in engineering education at other schools

Center for Engineering Education Excellence Management Team

The Center for Engineering Education Excellence Team provides the opportunity for collaboration among the programs, discussion of issues, planning of activities, and recommendations for college-wide initiatives. The committee consists of a Chairperson (the Director of the Center), the Assessment Director, the Associate Dean for Academic Affairs, the Director of the Professional Communications Center, the Ethics Coordinator, and one faculty representative from each of the Chemical, Civil, Computer, Electrical and Mechanical programs. The biweekly committee meetings serve as one focal point for the distribution and discussion of report findings and information. Committee members then share this information with the appropriate committees within their individual departments.

The members of the Center for Engineering Education Excellence Team have been the primary personnel involved with the initial organization and maintenance of the assessment structure. Meeting on a weekly basis, the team addressed a range of issues relating to the implementation of a continuous improvement program. Substantive tasks accomplished by the Committee include:


  • restatement of the College’s mission

  • articulation of an assessment process within each program

  • development of educational objectives for each program

  • development of objectives for each course within each program

  • determination of some assessment methods and metrics to measure the objectives and outcomes

  • development of a faculty workload policy

  • discussion regarding survey results (Senior Exit Survey, Course Survey, etc.)

  • review of and feedback on each college-wide survey or assessment technique

  • writing of self-study reports for the ABET accreditation review

  • participation in the ABET accreditation review

Program Assessment Committees

The Program Assessment Committees include three to five faculty members within each program and serve as the focus for problem solving, innovation and program change. Each program has articulated an assessment structure and process to collect and/or review data and information related to its student outcomes and course objectives. In general, each department assigned responsibility for addressing assessment data and/or topics to one or more committees within the department. A more extensive discussion of the continuous quality improvement processes for the degree programs follows in later sections.


Director of Assessment

The Director of Assessment position was created to develop and implement the overall college-wide assessment infrastructure, processes, and procedures for maintaining a continuous quality improvement program, and to provide technical support to the faculty implementing assessment processes in each degree program. Having a full-time person to direct and support assessment activities was an important step because it increased the flow of information among faculty and staff across disciplines, resulting in increased ownership of student learning outcomes and a heightened sense of responsibility toward the College’s graduates. The sharing of ideas, information and evaluation results enhanced communication between the administration and the faculty and staff members.



Assessment Plan

The Director of Assessment developed a three-year plan to guide the implementation and evaluation of the continuous quality improvement process and to establish timeframes, action strategies and a budget for the system. The assessment program plan set objectives, outcomes, criteria and a timeframe that established the framework for a continuous quality review/improvement system. The goals of this program are fourfold:



  1. to present conclusions regarding the overall outcomes of the student’s academic and extracurricular engineering performance for use in decision making by faculty, program chairs and administration;

  2. to present results about programs, activities, etc. in order to improve the programs;

  3. to enhance understanding and appreciation of formative and summative evaluation; and

  4. to contribute to the general body of knowledge with regard to evaluation of undergraduate engineering programs.

An example from this plan is given in the following section. Objective 1 provides for the overall assessment system for the College. See Appendix A for the complete Assessment Plan.

Assessment Plan
Program Objectives and Strategies



Objective 1:
Develop and implement an assessment program that provides processes and procedures for the continuous evaluation of student performance and satisfaction, faculty performance and satisfaction and stakeholder input into the educational system.

Action Strategies & Timeframes:


  1. Monitor the processes and procedures developed and implemented to evaluate assessment data provided to each program and the executive committee. (4/00; 4/01; 4/02; 4/03; 4/04)

  2. On an annual basis, each department will review and make recommendations for improvement based on assessment data collected to address each program outcome as part of the continuous quality review program. (Center for Engineering Education Excellence Team) (6/00; 6/01; 6/02; 6/03; 6/04)

  3. The Director of Assessment will prepare the annual Quality Review Program Report indicating the extent to which the action plans were implemented and achieved by each department, the feasibility of the time frames and recommendations for improving the process. (10/00; 10/01; 10/02; 10/03; 10/04)


Outcomes:


  1. Each program will submit written procedures, outlining each major step in the assessment process that will occur within the program, to be reviewed by the Dean.

  2. Each program will submit written procedures.

  3. On an annual basis, each department will provide a written summary report of findings (outcomes), results, actions taken, consequences, and recommendations verifying the assessment process has completed the annual cycle and specifying problems and solutions.

  4. The Director of Assessment will summarize results and recommendations of the Center for Engineering Education Excellence Team; then prepare a synopsis of the annual review indicating assessment measures analyzed, outcomes, recommendations, changes implemented, and the evaluation results of the changes.

  5. The Executive Committee will discuss and prioritize action strategies recommended as a result of the annual program review.


Resources:
The Director of Assessment position

An educational research graduate assistant

A work-study student assistant

The assessment plan provides a comprehensive outline of all of the tasks related to the Director of Assessment position. In addition, this plan also details the College instruments to be implemented and the methodology to be used to ensure that ongoing assessment and evaluation is undertaken by the degree programs. Use of the Strategic Plan for the College of Engineering and Information Technology is one way in which the degree program assessment processes are continually monitored, revised and evaluated. Departmental and college objectives and outcomes are modified annually to address new priorities or pursuits. The annual Quality Review Program Report is incorporated within the Strategic Plan.



Assessment Methods

The Director of Assessment has also identified and developed college-wide assessment tools for use in the continuous quality improvement system. A number of instruments, processes and procedures were developed and implemented to collect data that can be used to evaluate the effectiveness of the USC College of Engineering and Information Technology and its programs as well as student learning and growth. In addition, the Director of Assessment provided a Student Longitudinal Tracking System, coordinated the implementation of Employer Focus Groups, interviewed students and faculty members, assisted instructors with the evaluation of teaching/learning objectives for specific courses, and developed evaluation measures for examining the impact of the Professional Communications Center. A few of the important college-wide assessment instruments developed and utilized thus far in the assessment process are discussed in the following sections.



Senior Survey

Students graduating from the College of Engineering and Information Technology complete a survey requesting information about their undergraduate college experience and their judgment regarding specific engineering skills and abilities. The four-page survey obtains information in the following areas:



  1. overall ratings of students’ engineering education

  2. life-long learning indicators

  3. assessment of specific college services

  4. opportunity for students to make recommendations

  5. evaluation of ABET skills and competencies

  6. useful experiences

  7. extracurricular activities

  8. plans for graduate education

  9. employment information

  10. demographic information including transfer status

A copy of the survey is found in Appendix B. A Graduate Placement Sheet also accompanies the distribution of the survey; this assessment form requests an address for future mailings and employment and/or graduate school information. Students are given separate envelopes to return the Placement Sheet so that their anonymity will be maintained if they choose.



Administrative Procedures


Several methodologies have been utilized since the 1998 Spring Semester to administer the survey and the data sheet to graduating seniors. During the first three semesters, the College used a procedure in which graduating seniors from each program distributed and collected the surveys from students in their program. The use of paid student assistants encouraged participation and resulted in a return rate of approximately 80 percent. As a result of this more personalized approach, seniors began to learn of the importance of this type of information to the College. During recent semesters, the College has experimented with other distribution and retrieval methodologies. The one that appears to be the most successful in producing the highest return rate and quality responses is administering the instrument during a particular course in each program. The Chemical, Civil, Computer, Electrical and Mechanical programs each have a senior-level course comprised of graduating seniors. Administered at the end of the semester in these courses, the survey captures an even greater percentage of the graduating seniors and assures a more uniform administration of the assessment instruments.

Reporting


The Director of Assessment prepares a tabular listing of responses giving frequencies and percentages for the total results and the breakdowns for each degree program. An additional summary report, giving an analysis of the overall results and a synopsis of program differences, if any, accompanies the listing of results. An example of each report is given in Appendix C.


Course Survey


The Course Survey assessment instrument is administered to students enrolled in all undergraduate and graduate courses taught within the College each semester. Administration of the form is required for all courses enrolling five or more students, including APOGEE (distance education/continuing education) and other graduate courses.

The first seven items on the survey are those mandated by the state legislature. The wording and the options of these seven items are reproduced as required by state law. On a regular basis, the College scores on these items are reported to the Office of Institutional Planning and Assessment; data are then forwarded to the Commission on Higher Education. Other items on the survey were developed and approved by the Center for Engineering Education Excellence Team.

The Course Survey was administered for the first time at the end of the 1997 Fall Semester and has been revised several times to accommodate changes within the College and to improve the quality of the survey items. The revised survey is a two-sided Scantron sheet with four sections. Students provide course and instructor data in the first section. The second includes 23 items structured in a Likert-type format. Alternatives for most of the items follow a 5-point scale ranging from “strongly disagree” to “strongly agree,” with “neutral” as the midpoint. Two items use “very poor” to “excellent” response patterns, and one item uses a 4-point scale with a “very dissatisfied” to “very satisfied” response pattern. The third section provides space for instructors to add up to 12 additional questions. The last section contains three short-answer questions providing students with the opportunity to make their own observations and comments regarding the strengths and weaknesses of the course. A copy of this survey is given in Appendix D.

Administrative Procedures

Each faculty member receives packets that include course surveys and student and faculty instructions for survey completion. Memos to the students and faculty outline the coding instructions for adding the instructor identification and the course and section numbers for the scanning process, as well as survey dissemination, collection and retrieval information. Surveys are received in the Student Services Office, where they are sorted, coded, counted, aligned and sent to Computer Services for scanning. Student data are analyzed and reported using a database and programs written in the SAS (Statistical Analysis System) statistical software.
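
The tabulations themselves are straightforward. As a rough, hypothetical illustration of the kind of item summary the SAS programs produce (the College’s published reports are generated in SAS, not Python), the following sketch computes frequencies, percentages, means and standard deviations for one Likert item, broken down by program and for the College as a whole. The column names (program, q1) and the 1-5 coding are assumptions made only for this example.

    # Illustrative sketch only; the actual reports are produced with SAS.
    # Column names (program, q1) and the 1-5 Likert coding are assumptions.
    import pandas as pd

    def summarize_item(df, item):
        """Frequencies, percentages, mean and standard deviation for one item,
        reported for each program and for the college as a whole."""
        rows = []
        groups = list(df.groupby("program")) + [("College", df)]
        for name, grp in groups:
            counts = grp[item].value_counts().reindex(range(1, 6), fill_value=0)
            pct = 100 * counts / counts.sum()
            row = {"group": name}
            for k in range(1, 6):
                row[f"n_{k}"] = int(counts.loc[k])
                row[f"pct_{k}"] = round(float(pct.loc[k]), 1)
            row["mean"] = round(grp[item].mean(), 2)
            row["sd"] = round(grp[item].std(), 2)
            rows.append(row)
        return pd.DataFrame(rows)

    # Example: responses scanned from the Course Survey, one row per student.
    responses = pd.DataFrame({
        "program": ["ME", "ME", "CHE", "EE", "EE", "CE"],
        "q1": [4, 5, 3, 4, 2, 5],   # 1 = strongly disagree ... 5 = strongly agree
    })
    print(summarize_item(responses, "q1"))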


Reporting


Each semester, a tabular report listing the frequencies, percentages, means and standard deviations for each item alternative is generated for each faculty member. In addition to listing the faculty member’s total for each section, the report lists the departmental and college totals. The Director of Assessment also prepares a brief summary of the overall college results. Both reports are distributed to all College instructors. A more comprehensive report is prepared for the Executive Committee and members of the Center for Engineering Education Excellence Team. This report contains the frequencies, percentages, means and standard deviations for each item alternative for each program and the college totals. A copy of each type of report is located in Appendix E.

Alumnae/Alumni Survey

During the 1998 fall semester, the College of Engineering developed an Alumnae/Alumni Survey to obtain information from graduates who have been attending school or working for the past three years. The survey asks alumnae/alumni to evaluate several aspects of their undergraduate program and their present career position. The five-page instrument obtains information regarding the following topics:


  • Employment information

  • Satisfaction with career, salary, etc.

  • Continuing education

  • Rating of undergraduate experience

  • Rating of competency level for particular skills

  • Rating of importance of particular skills

  • Positive aspects of engineering program

  • Professors influential in professional development

  • Recommendations for improvement of educational experience

  • Professional development

  • Demographic information

A copy of the Alumnae/Alumni Survey (for graduates after three years) is included in Appendix F.



Administrative Procedures


The Assessment Director used the USC database of records to obtain student addresses for each mailing of the survey. The Alumnae/Alumni Survey is administered once a year to students who graduated three years prior to that date; this schedule was chosen because it allows graduates an average time period to complete a graduate degree or to become established in the workplace. The first mailing of this survey, to students who graduated in 1995, was completed during March 1999; approximately 22 percent of the surveys were returned because of insufficient or incorrect addresses. A second mailing, using alternative addresses if appropriate, was completed during the first week of May 1999. The second administration of the Alumnae/Alumni Survey took place in November 1999, with a follow-up mailed in March 2000; surveys were mailed to 1996 graduates. Inaccurate addresses continued to be a problem in reaching COEIT alumnae/alumni. The third administration of the Alumnae/Alumni Survey, for 1997 graduates, was completed during July 2000. Alumnae/alumni survey data have been entered and analyzed using SAS software.

Reporting


The Director of Assessment prepares a tabular listing of responses giving frequencies and percentages for the total results and the breakdowns for each program. An additional summary report giving an analysis of the overall results and a synopsis of program differences, if any, accompanies the listing of results. An example of each report is given in Appendix G. Copies of each report are mailed to each Executive Committee member and each Center for Engineering Education Excellence Team member. Additional personnel receiving reports include representatives from the Development, Career Services and Student Services departments.

Faculty and/or Staff Survey

An initial Faculty and Staff Survey was administered during May of the 1999 Spring Semester, addressing the following areas: (1) College goals and planning; (2) College-industry interaction; (3) College administration/leadership and communication; (4) College-wide services; (5) funding priorities; and (6) awareness of programs at aspirant institutions.


In April 2000, an alternative Faculty Survey was administered within the college to capture data similar to information requested from seniors and alumnae/alumni. This revised faculty survey elicited responses to questions concerning:

  1. the amount of experience students received on 21 skills

  2. the level of competency achieved by USC engineering students on 21 skills

  3. the extent to which reform learning/teaching strategies are incorporated within the classroom

  4. the level of student input for course improvement

  5. the improvement of the engineering education experience

  6. the use of different assessment tools within a course

  7. the professional development activities for faculty

A copy of each survey is located in Appendix H.




Administrative Procedures

Faculty surveys were mailed to each full-time faculty member within the College of Engineering and Information Technology. A cover letter, containing instructions for the return of the survey and an explanation of the importance of the requested information, and a labeled return envelope were provided within the survey packet. At the end of two weeks, an email was sent to all professors reminding them to complete and return the survey as soon as possible; an electronic copy of the survey was attached to the email.


Reporting


A tabular report listing the frequencies and percentages for each item alternative is generated for each program as well as college totals. The Director of Assessment also prepares a brief summary of the overall college results. Both reports are distributed to the Executive Committee and the Center for Engineering Education Excellence team members. A copy of each type of report is located in Appendix I.


Entering Student Questionnaire

An Entering Student Questionnaire was developed to provide specific information for administration personnel involved with student marketing and recruitment. The primary emphasis of this survey is determining why students chose to come to USC, what other colleges they applied to, and which reasons were important in their decision to attend the College of Engineering. Students are also asked to provide information about their academic background in math, chemistry, physics and writing. The survey also captures information about computer ownership, usage and training. A copy of this survey is found in Appendix J.



Administrative Procedures

Entering Student Questionnaires are administered once per year in the fall semester. These surveys are distributed to each faculty member teaching one of the freshman engineering courses. Surveys are administered and collected by these instructors. Emails are sent to the instructors, alerting them in advance that surveys are planned and reminding them when the surveys should be returned to the Assessment Office. Data are entered and analyzed using SAS software.




Reporting


A tabular report is prepared that lists frequencies and percentages where appropriate and provides student responses to open-ended questions. The Director of Assessment also writes a summary report that analyzes and summarizes significant trends, themes and findings from the student response data. Reports are distributed to the Executive Committee, Student Services, the Development Officer and the Center for Engineering Education Excellence team members.


Performance Assessment Instrument for an Oral Presentation
A number of other assessment instruments have been developed for use by faculty members within the classroom to evaluate specific instructional objectives. A performance assessment handout, listing course task expectations and providing an evaluation rubric for oral presentations in a senior-level course, was the first of several instruments developed and implemented, beginning in the 1998 Spring Semester. A copy of the handout is given in Appendix L.


Midterm Evaluation

A copy of a midterm evaluation survey is found in Appendix M. This form was developed for use in the Electrical and Computer Engineering sections to provide immediate feedback to the instructors regarding student perceptions of their progress and the overall effectiveness of the faculty member in achieving course objectives.




Educational Outreach

A survey designed to elicit information about ways to evaluate effectiveness and to improve the presentations was developed for use with the “E2 – Everyday Engineering” program. This is a school outreach effort that targets elementary, middle school and high school students. The Coordinator of this program creates and presents science-based learning activities in South Carolina area classrooms. A copy of this survey is found in Appendix N.



Professional Communications Center – Data Base and Evaluation of Impact
The Director of Assessment, in collaboration with the Director of the Professional Communications Center, planned several qualitative and quantitative methodologies to assess the impact of the writing center upon the students and faculty within the College of Engineering. A computer database and computer programs have been developed and implemented to obtain a more accurate reflection of the student/faculty consultations during each semester. Reports are generated each semester and at the end of the year; the tabular report for 1999 is found in Appendix O. This data collection effort examines the number of contacts occurring within the Center and individual classrooms involving PCC personnel. The data input also indicates the types of writing issues for which students and faculty seek assistance and the amount of time personnel spend with clients.
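
As a purely illustrative sketch (the Center’s actual database and programs are not reproduced here, and the field names date, client_type, issue and minutes are assumptions), the following Python fragment shows the kind of tabulation such a consultation log supports: contacts and total consultation time by type of writing issue, and contacts by client type.

    # Illustrative sketch only; field names are assumptions, not the PCC schema.
    import pandas as pd

    consultations = pd.DataFrame({
        "date":        ["1999-09-14", "1999-09-15", "1999-10-02", "1999-10-02"],
        "client_type": ["student", "student", "faculty", "student"],
        "issue":       ["lab report", "resume", "proposal", "lab report"],
        "minutes":     [30, 20, 45, 25],
    })

    # Number of contacts and total consultation time, by type of writing issue.
    by_issue = consultations.groupby("issue").agg(
        contacts=("minutes", "size"),
        total_minutes=("minutes", "sum"),
    )
    print(by_issue)

    # Contacts by client type (students vs. faculty) for the semester report.
    print(consultations["client_type"].value_counts())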


Student Longitudinal Tracking System

In collaboration with the University’s Institutional Planning and Assessment Office, the College of Engineering assisted with the design and implementation of a Longitudinal Student Tracking System that incorporates all of the necessary elements to study student trends from admission through graduation and beyond. The goal of this system is the availability of a college-wide mechanism that will provide data for faculty and administrators to enable them to continuously monitor and improve the quality of their programs.


To initiate the creation of the Longitudinal Student Tracking System, the College of Engineering developed a set of research questions and companion tables to specify the variables requested in the database and to show how the relationships among these variables might be displayed. A total of 36 research questions were enumerated; some of these are listed below.


  1. How many students were enrolled in each cohort (1990-91, 1991-92, 1992-93, 1993-94, 1994-95) for each of the following subgroups: total engineering students, first-time freshmen, and transfer students, showing ethnicity and gender for each subgroup?

  2. How many students in each cohort graduated as of June 1998, showing distributions for each of the following subgroups: total students, first-time freshmen, and transfer students, with breakdowns by ethnicity and gender for each subgroup?

  3. How many students in each cohort graduated in Engineering as of June 1998, showing distributions for each of the following subgroups: total students, first-time freshmen, and transfer students, with breakdowns by ethnicity and gender for each subgroup?

  4. What are the average cumulative GPAs of graduates within each cohort who received an Engineering degree, showing the distributions for the following subgroups: total engineering students enrolled, first-time freshmen, and transfer students, with breakdowns by ethnicity and gender?

Using the College of Engineering research request as a guide, Planning and Assessment personnel downloaded student data from various USC mainframe systems to compile the Longitudinal Student Tracking component. This new database includes student data from the 1990-91 cohort to the 1997-98 cohort and incorporates 677 variables of interest. Variables can be grouped into the following six categories: admissions data (SAT scores, rank, entry status, etc.); demographic information (gender, ethnicity, etc.); academic performance indicators (grades in courses, GPA, etc.); graduation statistics; retention; and withdrawal rates. A copy of a report developed using some initial longitudinal student tracking data is located in Appendix P.
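
As a rough illustration of how such questions can be answered once the tracking database is in place, the following Python sketch computes enrollment counts and graduation rates for each cohort, broken down by subgroup. The column names (cohort, entry_status, gender, graduated_by_jun98) are assumptions made for the example and do not reflect the actual 677-variable schema.

    # Illustrative sketch only; column names are assumptions, not the USC schema.
    import pandas as pd

    def graduation_rates(students, by):
        """Enrollment counts and graduation rates for each cohort, broken down
        by the requested subgroup columns (e.g. entry status, gender, ethnicity)."""
        grouped = students.groupby(["cohort"] + by)
        summary = grouped.agg(
            enrolled=("graduated_by_jun98", "size"),
            graduated=("graduated_by_jun98", "sum"),
        )
        summary["grad_rate_pct"] = (100 * summary["graduated"] / summary["enrolled"]).round(1)
        return summary.reset_index()

    # Example rows in the style of the research questions above.
    students = pd.DataFrame({
        "cohort": ["1990-91", "1990-91", "1991-92", "1991-92"],
        "entry_status": ["first-time freshman", "transfer", "first-time freshman", "transfer"],
        "gender": ["F", "M", "M", "F"],
        "graduated_by_jun98": [True, False, True, True],
    })
    print(graduation_rates(students, by=["entry_status", "gender"]))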


Evaluation of The Bates House Living-Learning Community

During the 1999 fall semester, freshmen students in The College of Engineering and Information Technology were offered the opportunity to participate in a unique Living-Learning Community program developed in collaboration with the USC Housing Department. The Engineering Community in Bates House is an on-campus residential community designed to enrich the educational environment for first-year engineering students. Development of this concept was based on research documenting the benefits of students living in learning environments that foster student-faculty interaction and student peer relationships strengthened by involvement with each other both in and out of the classroom.


More specifically, the goals of the Engineering Community in Bates House are:

  1. To increase the retention rate of these freshmen by creating a learning environment that maximizes their potential for success;

  2. To incorporate active learning strategies and increased academic support to increase academic performance indicators such as the student’s grade point average (GPA);

  3. To develop professional attitudes and to emphasize experiential learning by encouraging student involvement in the community and the professional engineering organizations;

  4. To develop and implement the use of new technologies, such as laptop computers, that can be applied in the classroom to enhance education program delivery;

  5. To provide early design and teamwork experience to enhance student motivation and learning and to develop leadership, communication and problem solving skills.

The increases in retention and academic performance are primarily long-term research questions. The Bates House project students will be tracked during their subsequent years at USC, with course grades and GPA data collected each semester. Retention figures for this group of students will be tabulated, with overall results available at the end of the first, second and fourth years of the project.


A group of engineering students with similar academic backgrounds will be randomly selected for use as a control group to provide a criterion for judging program success. Retention rates, course grades and GPA data will be collected for this group of students each semester from the 1999-2000 through the 2002-2003 academic years. The control and experimental groups will be compared to determine whether the additional academic support and activities given the Bates House students yield improved performance and retention within the College. A summary of the initial results of the project is located in Appendix Q.
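
When the retention and GPA data become available, the comparison itself can be kept simple. The following hypothetical Python sketch (the column names group, retained_year1 and gpa are assumptions) tabulates retention rate and mean GPA for the Bates House and control groups and applies a two-sample t-test to the GPA difference; it is one possible analysis, not the project’s prescribed method.

    # Illustrative sketch only; column names and data are assumptions.
    import pandas as pd
    from scipy import stats

    records = pd.DataFrame({
        "group":          ["bates", "bates", "bates", "control", "control", "control"],
        "retained_year1": [True,    True,    False,   True,      False,     False],
        "gpa":            [3.1,     2.8,     2.2,     2.9,       2.0,       2.4],
    })

    # Descriptive comparison: retention rate and mean GPA for each group.
    summary = records.groupby("group").agg(
        n=("gpa", "size"),
        retention_rate=("retained_year1", "mean"),
        mean_gpa=("gpa", "mean"),
    )
    print(summary)

    # A two-sample t-test on GPA gives a rough check on whether the observed
    # difference could plausibly be due to chance (assumes roughly normal GPAs).
    bates_gpa = records.loc[records["group"] == "bates", "gpa"]
    control_gpa = records.loc[records["group"] == "control", "gpa"]
    t_stat, p_value = stats.ttest_ind(bates_gpa, control_gpa, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")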


Quality Review Process
Feedback from the departments to the ABET/Gateway Committee concerning improvements undertaken within programs as a result of evaluation information is a key feature of the continuous quality assessment loop within the College. Although college-wide efforts provide specific pieces of student and faculty feedback, departments are also responsible for determining the additional assessment activities needed to evaluate their individual objectives. Within each Department, the survey results and reports are analyzed and discussed within the formal assessment structure and procedures adopted for program improvement. Each departmental committee reports the changes, modifications, and/or strategies they expect to follow to accentuate positive findings and provide corrective measures for the areas in need of attention.

The review process is a key component in linking the College-wide System with the Program Assessment Systems. It is also the means for initiating modifications and the framework for reporting on those changes and the subsequent results. The following figure highlights the committees and procedures utilized within the COEIT CQI System.






Figure 3. CQI Review Process

The Director of Assessment prepares tabular and summary reports for each assessment tool utilized within the College-wide System. As indicated above, all reports are generated and distributed to the College Executive Committee and the Center for Engineering Education Excellence Team. Findings are discussed at meetings of both of these committees.


The Program Chairpersons provide each faculty member with an electronic or hard copy of the summary and the tabular report of the results. Within each program, the results are analyzed and discussed within the formal assessment structure adopted for program improvement. Each program committee makes recommendations and initiates changes within the curriculum. As part of the strategic planning function each year, the programs include a report explaining the changes, modifications, and/or strategies they followed to accentuate positive findings and provide corrective measures for the areas in need of attention. Members of the Center for Engineering Education Excellence Team report and discuss these conclusions at committee meetings throughout the year. The same procedures are followed for the findings from each survey administration.
Program Assessment Structures and Processes

The assessment and continuous quality improvement processes implemented within the COEIT are both college-wide and departmentally focused. The program assessment systems are driven by the college-wide infrastructure; it provides the foundation necessary to generate and disseminate findings and reports for the College. More important, this infrastructure generates the coordination and collaboration efforts needed to facilitate program improvement.

The departments have responsibility for the educational programs; thus, implementation of the assessment processes is focused on the departments and the education programs. The departmental systems consist of on-going, institutionalized processes, with the elements repeated at regular intervals to assure fresh assessment data and appropriate improvement plans.

In preparation for the development of the individual program assessment plans, the Center for Engineering Education Excellence Team and the Executive Committee participated in the review and modification of the statements specifying the College vision, mission, goals and objectives. The document adopted by the College on November 27, 1998, is included in the following paragraphs.



Vision Statement

The College of Engineering and Information Technology will be a national model for innovation and responsiveness in addressing the engineering education, economic development and lifelong learning needs of the state.



Mission Statement

The mission of the College of Engineering and Information Technology is to serve the engineering and technology needs of South Carolina through our programs of education, research, and outreach.



Goals and Objectives of the College

1. Meet the educational needs of South Carolina industry, our students and the engineering profession.

2. Support the economic development of our state and create new opportunities.

3. Be recognized as a learning community of students, faculty, and staff that develops student motivation and capability for learning that enhances their careers and lives.

4. Provide an environment that encourages individual intellectual curiosity and freedom and motivates students to meet high academic and ethical standards.

5. Be recognized for research and scholarship and assist the university in its aspiration to become an AAU institution.

6. Develop a supportive climate that attracts and supports a diverse group of faculty, staff, and students.

7. Be recognized as a college committed to becoming better and more productive and to continuous improvement in its education, research, and outreach mission.

Training and Preparation

During the development process, faculty members from each program also attended a workshop conducted by Jack McGourty on October 9, 1998 to assist them in developing objectives and outcomes, as well as in planning the strategies and actions needed to implement their program objectives. Templates for each step in the assessment process were distributed to attendees, who worked in groups to practice writing objectives and outcomes. In addition to the workbooks provided by the Gateway Coalition, members of the Center for Engineering Education Excellence Team also received the booklet entitled “Stepping Ahead: An Assessment Plan Development Guide,” written by Gloria M. Rogers and Jean K. Sando with funding from the National Science Foundation and the Foundation Coalition. Examples of the template used by the faculty members are located in Appendix R. The Assessment Director provided each department with guidelines to assist in the implementation of their individual systems; the programs utilized the following outline in their preparations. This document is included in the following section.



Recommended Procedures for Articulating and Documenting Assessment Processes

August 1998



