Review of the computer science program



ABET

Computing Accreditation Commission



SELF-STUDY

QUESTIONNAIRE FOR REVIEW

of the

COMPUTER SCIENCE PROGRAM
submitted by



New Mexico Tech

Institution




Jun 29, 2007

Date



to the

Computing Accreditation Commission


Primary contact:

Peter F. Gerity




Telephone number:

(505) 835-5227

FAX Number:

(505) 835-5649




Electronic mail:

pgerity@admin.nmt.edu


ABET
Computing Accreditation Commission

111 Market Place, Suite 1050

Baltimore, Maryland 21202-4012

Phone: 410-347-7700

Fax: 410-625-2238

E-mail: cac@abet.org

www: http://www.abet.org/
Introduction
The Criteria for Accrediting Computer Science Programs are divided into seven major Categories, each Criterion containing a statement of Intent and Standards. An intent statement provides the underlying principles associated with a Criterion. In order for a program to be accredited, it must meet the intent statement of every Criterion.
Standards provide descriptions of how a program can minimally meet the statements of intent. The word “must” is used within each standard to convey the expectation that the condition of the standard will be satisfied in all cases. For a program to meet the intent of a Criterion, it must either satisfy all the standards associated with that Criterion or demonstrate an alternate approach to achieving the intent of the Criterion.
For each of the following seven sections, corresponding to each of the seven Categories of the Criteria, answer all of the questions associated with the standards. If one or more standards are not satisfied, it is incumbent upon the institution to demonstrate and document clearly and unequivocally how the intent is met in some alternate fashion.
If you are having more than one program evaluated, particularly if the programs are on separate campuses, the answers to these questions may vary from one program to another. If this is the case, please use separate copies of each section for each program, and clearly delineate which program is being described.


I. Objectives and Assessments


Intent: The program has documented, measurable objectives, including expected outcomes for graduates. The program regularly assesses its progress against its objectives and uses the results of the assessments to identify program improvements and to modify the program’s objectives.

Standard I-1. The program must have documented, measurable objectives.

Standard I-2. The program’s objectives must include expected outcomes for graduating students.




A. Objectives

Please attach items that support or precede the measurable objectives, e.g.,



  1. Mission statements from institution, college, department, program

  2. Plans (institution, college, department, etc.)

  3. All objectives including expected outcomes for graduates (itemize)

  4. Process for assessments

  5. Who is involved in assessment and improvement?

  6. Data from assessments

  7. Inputs from any supporting Office of Assessment

1. Indicate below or attach to this document the program’s measurable objectives. These objectives must include expected outcomes for graduates.




New Mexico Tech:
Located in Socorro, NM, this small state-funded university was founded as the New Mexico School of Mines in 1889. It was renamed New Mexico Institute of Mining and Technology in 1951 and has been commonly known as New Mexico Tech (NMT) since the 1960s. The institute's twelve departments offer twenty-two undergraduate programs and a number of graduate degrees at both the master's and doctoral levels. The Computer Science department offers a B.S. degree in Information Technology in addition to B.S., M.S., and Ph.D. degrees in Computer Science.

Mission Statement:
The following appears in the institute's 2007-09 Catalog. The catalog is printed and distributed to all students, staff, and faculty free of charge. Prior to 2007 it was printed every year; from 2007 onwards it will be printed every two years, with the Web version updated as necessary to reflect interim changes.
New Mexico Tech is an institute of higher learning that serves the diverse population of New Mexico by integrating education, research, public service, and economic development through emphasis on science, engineering, and natural resources. Its mission is multi-fold:


  1. helping students learn creative approaches to complex issues,

  2. acknowledging state and national diversity and developing an inclusive learning environment,

  3. creating and communicating knowledge, and

  4. solving technical and scientific problems.


Computer Science Program:
The B.S. in Computer Science program, a conventional on-campus four-year education program offered by the Computer Science Department, is under review for possible accreditation by the Computing Accreditation Commission (CAC) of ABET, Inc. The program has produced 444 graduates between 1966 (the year it was founded) and 2006; recent trends in undergraduate enrollment and graduation are summarized in Tables 1.1 and 1.2. In Spring 2007, the numbers of graduate students enrolled in the M.S. and Ph.D. programs in Computer Science were 42 and 18 respectively. A sample course sequence for the B.S. in Computer Science program is listed in Table 1.3; a corresponding flowchart (with vertical columns representing semesters, solid arrows showing prerequisites, and dotted arrows co-requisites) appears in Table 1.4.
Table 1.1: Enrollment trends

Year   Full-time Enrollment in B.S. in Computer Science
2001   184
2002   207
2003   228
2004   220
2005   189
2006   172

Table 1.2: Graduation trends

Year   B.S. in Computer Science Degrees Conferred
2000   22
2001   16
2002   23
2003   27
2004   24
2005   18
2006   28







Table 1.3: A sample course sequence for the B.S. in Computer Science program


Semester 1

4 MATH131 (calculus)

4 CS111 (introduction to computer science)

4 CHEM 121 & 121L (general)

3 ENGL111 (college English)

15 Total credit hours



Semester 2

4 MATH132 (calculus)

3 CS122 (algorithms and data structures)

4 CHEM122 & 122L (general)

3 ENGL112 (college English)

3 Social Science

17 Total credit hours

Semester 3

3 CS221 (computer systems)

5 PHYS121 & 121L (general)

3 MATH221 (discrete mathematics)

6 Electives

17 Total credit hours



Semester 4

3 CS222 (systems programming)

3 CS324 (programming languages)

5 PHYS122 & 122L (general)

3 Humanities

3 CS353 (data and computer communications)

17 Total credit hours
Semester 5

3 CS331 (computer architecture)

3 CS344 (algorithms)

3 MATH382 (probability and statistics)

1 MATH382L (probability and statistics)

3 ENGL 341 (technical writing)

3 Technical Elective

16 Total credit hours


Semester 6

3 CS326 (software engineering)

3 CS342 (formal languages)

3 CS382 (social issues)

3 Social Science

3 Technical Elective

2 Electives

17 Total credit hours



Semester 7

4 CS423 (compiler writing)

3 Humanities

3 Social Science

3 Technical Elective

3 Breadth Elective

16 Total credit hours

Semester 8

4 CS325 (operating systems)

3 Humanities/Social Science

3 Technical Elective

5 Electives

15 Total credit hours



Table 1.4: A flowchart for the B.S. in Computer Science sample course sequence







Mission Statement:
The mission of the Computer Science Program is to produce computer science graduates who, trained in the design, implementation, and analysis of computational systems and skilled in technical communication, will contribute towards the advancement of computing science and technology.


Program Educational Objectives:
Within a few years of graduating with a B.S. in Computer Science degree, our students will demonstrate their


  1. ability to design, implement, and analyze computational systems;

  2. capability to tackle complex computer science related problems in the real world;

  3. contribution towards the advancement of computing science and technology;

  4. capacity to work effectively with peers in computational tasks; and

  5. cognizance of ethical, social, and legal issues pertaining to computer science.



Program Educational Outcomes:
The undergraduate academic program in Computer Science will enable our graduates to acquire, by the time of their graduation:

1a. the ability to design, implement, and test small software programs;
1b. the ability to design, implement, and test large programming projects;
2. knowledge of the theoretical concepts of computing;
3. knowledge of the fundamental principles of programming languages, systems, and machine architectures;
4. exposure to one or more computer science application areas;
5. technical communication skills in written and oral form;
6. the capacity to work as part of a team; and
7. awareness of the legal, ethical, and societal impact of developments in the field of computer science.

The department's mission statement, program objectives, and program outcomes appear on the department's website, http://www.cs.nmt.edu, as well as in the 2007-09 institute catalog, with two exceptions: in the catalog, (a) program outcomes 1a and 1b are combined into one phrase, and (b) the word legal is absent from the rendition of Outcome 7. The motive for (a) was to ensure smoother reading, given the common prefix in the text of Outcomes 1a and 1b. Regarding (b), as a result of further discussions among the faculty and later with the Industrial Advisory Board in Spring 2007, it was decided that the legal impact of developments in the field is an important component that should also be reflected in the program outcomes.



Link between Program Objectives and Outcomes:
Each program objective is tied to program outcomes as shown in the following table.

Table 1.5: Program Objectives and Program Outcomes

              Outcome:  1a   1b   2    3    4    5    6    7
Objective 1             X    X    X    X
Objective 2             X    X    X    X
Objective 3             X    X    X    X    X
Objective 4                                      X    X
Objective 5                       X    X                    X

Students are enabled to fulfill the first objective (ability to design, implement, and analyze computational systems) by achieving Program Outcomes 1a, 1b, 2, and 3, since these provide the needed fundamentals. The second objective (capability to tackle complex computer-science-related problems in the real world) is enabled by the same program outcomes, for the same reason. The third objective (contribution towards the advancement of computing science and technology) should be achieved given those same outcomes (1a, 1b, 2, 3) together with Outcome 4, which addresses exposure to applications. The fourth objective (capacity to work effectively with peers in computational tasks) will be achieved by students attaining Outcomes 5 (technical communication) and 6 (team work). The fifth and final objective (cognizance of ethical, social, and legal issues pertaining to computer science) is linked not only to Outcome 7, which addresses the same issues, but also to Outcomes 2 and 3, since knowledge of the fundamentals of computing is crucial for discerning what is and is not computationally feasible among suggested solutions, techniques, and approaches.


2. Describe how the program's objectives align with your institution's mission




Program objectives and the institution’s mission:

Figure 1.1: Institute Mission and Program Objectives



The objectives are consistent with the institute’s mission as depicted in Figure 1.1.


Computational problems require creative approaches; thus the first two objectives address the first mission through the development of a suite of skills and abilities. The last three objectives aim to create and communicate knowledge while remaining cognizant of ethical, social, and legal issues; hence they amplify the third mission (creating and communicating knowledge). The second, third, fourth, and fifth objectives empower our graduates to fashion responsible solutions to technical and scientific problems by contributing their technical abilities effectively while being mindful of societal impacts, thus fulfilling the fourth mission (solving technical and scientific problems). As a consequence of the last two objectives, our graduates will strive towards a workplace that accommodates diversity, promotes accessibility, and supports peers with disabilities, thereby fostering the second mission (acknowledging state and national diversity and developing an inclusive learning environment).

Preparing for the Accreditation Process:
The decision was made to start working towards an ABET CAC accreditation in a meeting on December 8, 2005. Since then, from Spring 2006 to Fall 2006, the faculty has educated itself on the process by working together, by consulting with experienced colleagues, and through a visit by two of its members to the CAC Summit Meeting at Tampa Bay, Florida, in November 2006. Within this time, interrupted only slightly by the department’s move from Speare Hall to a newly rebuilt Cramer Hall in August 2006, concrete implementations took shape steadily, e.g., moving from a mix of assessment methods (insight from course evaluations, direct communication from students, feedback via surveys within a course, instructor’s observations, and student scores in examinations or projects) prior to 2005 to a quantitative assessment formula in Spring 2006, surveys for graduating seniors and alumni in Fall 2006, and a functioning Industrial Advisory Board in Spring 2007. The initial steps taken were to


  1. Define the stakeholders for the program;

  2. Articulate the program’s goals in the form of longer-term objectives and shorter-term outcomes; and

  3. Create a process for periodic review of the state of the program ensuring that

    1. quantitative measures were used for assessment in addition to informal observations,

    2. the results were integrated with other faculty decisions, and

    3. representatives of all stakeholders were involved.



Defining Stakeholders:
The faculty decided on the primary stakeholders of the program:


  1. Current undergraduate Computer Science majors,

  2. Alumni who graduated with a B.S. in Computer Science degree,

  3. Computer Science faculty, and

  4. Potential and actual employers of our students.

Table 1.6 lists these stakeholders along with our modes of interaction with them.


In addition, we agreed that we had secondary stakeholders with whom our interactions were informal: parents of students and the broader New Mexico community. We interact with parents informally during their visits with us; a member of the Industrial Advisory Board, Ron Tafoya, is particularly attuned to the broader community of New Mexico.
The Industrial Advisory Board:
An Industrial Advisory Board (IAB) was set up and convened in Spring 2007. As indicated in Table 1.7, its members were chosen with care to represent potential and actual employers of our students; the Board also includes Ms. Cherish Franco, a current undergraduate Computer Science major.
Table 1.6: Primary Stakeholder interaction

Undergraduate Computer Science majors:
  • Academic advising by faculty;
  • Feedback via a Senior Exit Survey on graduation; and
  • A member on the IAB who will report personal feedback from peers, encourage their input into periodic surveys, and help in interpreting the results.

Alumni who graduated with a B.S. in Computer Science degree:
  • Feedback via an Alumni Survey (facilitated by the creation of an Alumni database); and
  • Representation on the Industrial Advisory Board.

Computer Science faculty:
  • Involvement in all these activities, in review, and in decisions.

Potential and actual employers of our students:
  • Periodic meetings with the Industrial Advisory Board.

Table 1.7: Industrial Advisory Board (IAB)

IAB Member                NMT Alumnus? Year                   Organization
David Duggan              B.S. in Computer Science, 1980      Sandia National Labs
Cherish Franco            (current undergraduate CS major)    New Mexico Tech
Alex Kent                 B.S. in Computer Science, 1995      Los Alamos National Lab.
Greg Lindhorst            B.S. in Computer Science, 1989      Microsoft Corp.
Nicholas D. Pattengale    B.S. in Computer Science, 2001      Sandia National Labs
Desh Ranjan                                                   New Mexico State University
Karthikeyan Ramamoorthy   M.S. in Computer Science, 2005      Yahoo! Inc.
Ronald Tafoya                                                 Intel Corp.

Defining Program Objectives and Program Outcomes:
In Spring 2006, the faculty formulated an initial list of Program Educational Objectives and Program Outcomes for the B.S. in Computer Science program; the list was revised in Fall 2006 with the addition of a program outcome regarding applications and some minor re-wording. These lists were then submitted for approval, through a questionnaire (Attachment 1) that also invited open-ended comments, to all students taking junior and senior courses in Fall 2006. The students provided not only overwhelming approval but also useful discussion that led to a minor change in wording. Thus, after minor modifications and another round of discussions, the faculty decided to go ahead with the list of Program Objectives and Program Outcomes. In Spring 2007, the Industrial Advisory Board was asked to draw upon the collective experience, wisdom, perspective, and vision of its members to evaluate whether these objectives and outcomes capture the skills, abilities, and traits required of our graduates to achieve success in the real world, both today and in the foreseeable future; the Board approved. Furthermore, the Alumni Survey (Attachment 3) included a question (Question 4 under Program Objectives) asking our alumni about the appropriateness of these lists; the alumni agreed (see Section I.C, Page 31).
Defining the Process for Assessment:
Assessment is a continuous process. We have adopted a calendar (Table 1.8) whereby, on a periodic basis (each semester and annually), program educational objectives and outcomes are reviewed for continued appropriateness, assessment data are collected, distilled into useful information and discussed by the faculty, and decisions are taken and their efficacy reviewed later.
Defining the measurements:
Since program objectives are long-term goals, we measure them through our alumni survey. In Spring 2007, we decided to target those who graduated from our program in 2002 and 2003, a group for which the time elapsed since earning the B.S. degree is long enough for the attributes described in our objectives to manifest, yet short enough that their undergraduate experience remains reasonably pertinent. (In Spring 2008, we will target those who graduated in the academic year 2003-2004.) The problem with asking alumni whether they have achieved a particular objective is that the answer is an opinion, in fact a self-evaluation. Such indirect measures are less objective than direct measures, which are rooted in evidence. Hence, after consultation with more experienced colleagues, we included a question (Question 2 under Program Objectives) that asks for a description of an occasion, if any, on which the objective was attained; this provides a more direct measure. Thus, we gather both an indirect and a more direct measure of the attainment of program objectives by our alumni.
Similarly, we decided on two distinct measures of the achievement of program outcomes, indirect and direct: (a) we administer an exit survey in which we ask for a self-evaluation, (b) we compute numeric scores for program outcomes from graded student work in courses. At this time, we have restricted ourselves to mandatory courses because the elective sequences taken by students are unpredictable. We explain this process below.

Table 1.8: Spring and Fall Assessment Calendar

Spring
  Beginning: Faculty reviews the exit surveys filled out in the previous semester.
  Beginning: Faculty discusses course assessments.
  Beginning: An assessment summary for the previous year is prepared for NMT.
  Middle: Alumni surveys are sent out.
  Middle: Faculty reviews the state of the program, including the efficacy of any past actions resulting from assessment.
  Middle: IAB meeting: faculty presents its findings and proposed changes.
  Middle: Faculty analyzes the IAB recommendations and makes decisions.
  End: Faculty reviews the return rate for alumni surveys.
  End: Senior Exit Surveys are administered.
  End: Instructors submit assessment reports for courses.

Fall
  Beginning: Faculty reviews the exit surveys filled out in the previous semester.
  Beginning: Faculty examines information gleaned from the returned Alumni Surveys.
  Beginning: Faculty reviews the program educational objectives and outcomes for continued appropriateness and requests IAB input on the same.
  Beginning: Faculty discusses course assessments.
  Middle: IAB meeting: faculty presents new findings and proposed changes.
  Middle: Faculty sends curriculum changes, if any, to the NMT Faculty Senate.
  End: Senior Exit Surveys are administered.
  End: Instructors submit assessment reports for courses.

Computing numeric scores for program outcomes:


We chose to compute a score for a program outcome by combining results from contributing courses. We ruled out rubrics such as "more than 70% of students in each contributing course must perform satisfactorily" because they are hard to combine (the rubric being satisfied by each of two courses does not imply that 70% of students performed satisfactorily in both of those courses). Instead, we compute a numeric score for a program outcome as a linear weighted sum of numeric scores from selected contributing courses and compare it against a pre-defined goal.
We started by creating a matrix whose rows are the computer science courses and columns the program outcomes. In each cell of the matrix, we arrived (by consensus) at a number between 1 and 3 to represent the degree of contribution of a course (the row) towards an outcome (the column), i.e., the impact factor of the course for that program outcome, with the interpretation listed in Table 1.9. Note that we only consider graded components of a course; thus, courses that reinforce or extend coverage, but do not grade related material, do not have an entry.

Table 1.9: Interpretation of the degree of contribution of a course towards program outcomes

Contribution   Interpretation
1              Introductory / preliminary
2              Reinforcement / extension / application
3              Major component

The result is shown in Table 1.10. This table was then restricted to the mandatory courses only (as explained earlier) to get Table 1.11. That table, in turn, was pruned further by retaining the major components (value of 3); however, in case of insufficient entries along columns, values of 2 and 1 were also retained; the resulting matrix is shown in Table 1.12.


With this restriction, the net numeric score for the jth program outcome is the normalized weighted sum

    score_j = (sum over i of n_ij * s_ij) / (sum over i of n_ij)

where the weights n_ij are the non-zero entries in the column for program outcome j in Table 1.12, and each value s_ij is a score that comes from the assessment of the ith course specifically for the jth program outcome. For example, Table 1.12 shows that Program Outcome 5 (Technical Communication) will be measured using three courses, CS326 Software Engineering, CS423 Compiler Writing, and CS331 Computer Architecture, with impact factors of 3, 3, and 2 respectively. If the numeric scores assessed by those three courses are 3, 2, and 4 respectively, then the score computed for Program Outcome 5 is (3*3 + 2*3 + 4*2)/(3+3+2), i.e., 2.88.
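This computation is straightforward to script. The short Python sketch below is illustrative only (it is not the program actually used by the department) and reproduces the worked example above:

    # Normalized weighted sum for a program outcome score.
    # impact_factors: the non-zero weights n_ij for outcome j (from Table 1.12);
    # course_scores:  the scores s_ij reported by the contributing courses.
    def outcome_score(impact_factors, course_scores):
        weighted_total = sum(n * s for n, s in zip(impact_factors, course_scores))
        return weighted_total / sum(impact_factors)

    # Program Outcome 5 (Technical Communication): CS326, CS423, CS331
    # with impact factors 3, 3, 2 and assessed scores 3, 2, 4.
    print(f"{outcome_score([3, 3, 2], [3, 2, 4]):.2f}")  # prints 2.88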

We decided to limit the score sij (reported by a course i for a program outcome j) to a number between 1 and 4 with the interpretation given in Table 1.13. (We started with a 1..3 scale, and then realized the need for a bigger range.) Since the score for a program outcome is a normalized weighted sum, it has the same range and the same interpretation.



Table 1.10: Contribution of courses towards Program Outcomes

Columns: 1a. Small Prog.; 1b. Large Prog.; 2. Theory; 3. PL/Sys/Arch.; 4. Applications; 5. Tech. Comm.; 6. Team work; 7. Ethics. A dash (-) indicates no graded contribution. Courses marked "Yes" under Optional? are electives (they do not appear in Table 1.11).

Course  Optional?  Course Title                                           1a  1b  2   3   4   5   6   7
CS111              Intro to CS & Programming                              3   -   1   1   1   -   -   -
CS122              Algorithms & Data Structures                           3   -   1   1   1   -   1   -
CS221              Computer Systems Organization                          2   -   2   3   1   -   -   1
CS222              Systems Programming                                    2   2   1   3   2   -   -   -
CS324              Principles of Progr. Languages                         3   -   2   3   2   -   -   -
CS325              Operating Systems                                      -   3   -   3   1   1   3   -
CS326              Software Engineering                                   -   3   1   2   -   3   3   -
CS331              Computer Architecture                                  3   -   -   3   1   2   -   -
CS342              Formal Languages & Automata                            -   -   3   -   1   -   -   -
CS344              Design & Analysis of Algorithms                        -   -   3   -   1   -   -   -
CS353              Data & Computer Communication                          -   2   2   3   2   -   -   -
CS382              Legal, Ethical, and Social Issues of I.T.              -   -   -   -   -   -   -   3
CS423              Compiler Writing                                       2   3   2   2   -   3   3   -
CS328   Yes        Secure Software Construction                           -   3   1   2   3   3   3   -
CS209   Yes        Programming Language Practicum                         2   -   -   2   -   -   -   -
CS351   Yes        Modeling & Simulation Technologies for Info. Systems   2   2   2   2   3   -   -   2
CS373   Yes        Introduction to Database Systems                       2   -   2   2   2   -   -   -
CS391   Yes        Internet and Web Programming                           2   2   -   3   3   2   2   -
CS441   Yes        Cryptography and Applications                          -   2   3   -   3   2   2   -
CS451   Yes        Introduction to Parallel Processing                    2   2   2   3   2   2   -   -
CS453   Yes        Computer Networks & the Internet                       -   2   -   2   2   -   -   -
CS454   Yes        Computer Graphics                                      2   2   2   2   2   2   -   -
CS463   Yes        Information Assurance                                  2   -   2   2   3   2   2   2
CS464   Yes        Introduction to Soft Computing                         2   -   3   2   2   1   2   -
CS476   Yes        Visualization                                          -   -   -   -   3   2   -   2
CS489   Yes        Spl. Topics: Digital Forensics                         2   -   -   2   -   2   -   2
CS491   Yes        Directed Study (not usable for graduation)             -   -   -   -   -   -   -   -
Table 1.11: Contribution of mandatory courses only towards Program Outcomes

Columns as in Table 1.10. An asterisk (*) marks a number whose contributing course (row) has been chosen as a principal contributor for the program outcome (column) and will be retained for the next round of pruning.

Course  Course Title                                                   1a   1b   2    3    4    5    6    7
CS111   Intro to Comp Science & Programming                            3*   -    1    1    1    -    -    -
CS122   Algorithms & Data Structures                                   3*   -    1    1    1    -    1    -
CS221   Computer Systems Organization                                  2    -    2    3*   1    -    -    1*
CS222   Systems Programming                                            2    2    1    3*   2    -    -    -
CS324   Principles of Programming Languages                            3*   -    2    3*   2*   -    -    -
CS325   Operating Systems                                              -    3*   -    3*   1    1    3*   -
CS326   Software Engineering                                           -    3*   1    2    -    3*   3*   -
CS331   Computer Architecture                                          3*   -    -    3*   1    2*   -    -
CS342   Formal Languages & Automata                                    -    -    3*   -    1    -    -    -
CS344   Design & Analysis of Algorithms                                -    -    3*   -    1    -    -    -
CS353   Data & Computer Communication                                  -    2    2    3*   2*   -    -    -
CS382   Legal, Ethical, and Social Issues of Information Technology    -    -    -    -    -    -    -    3*
CS423   Compiler Writing                                               2    3*   2    2    -    3*   3*   -


Table 1.12: Principal components in the computation of Program Outcome scores

Columns as in Table 1.10.

Course  Course Title                                                   1a   1b   2    3    4    5    6    7
CS111   Intro to Comp Science & Programming                            3    -    -    -    -    -    -    -
CS122   Algorithms & Data Structures                                   3    -    -    -    -    -    -    -
CS221   Computer Systems Organization                                  -    -    -    3    -    -    -    1
CS222   Systems Programming                                            -    -    -    3    -    -    -    -
CS324   Principles of Programming Languages                            3    -    -    3    2    -    -    -
CS325   Operating Systems                                              -    3    -    3    -    -    3    -
CS326   Software Engineering                                           -    3    -    -    -    3    3    -
CS331   Computer Architecture                                          3    -    -    3    -    2    -    -
CS342   Formal Languages & Automata                                    -    -    3    -    -    -    -    -
CS344   Design & Analysis of Algorithms                                -    -    3    -    -    -    -    -
CS353   Data & Computer Communication                                  -    -    -    3    2    -    -    -
CS382   Legal, Ethical, and Social Issues of Information Technology    -    -    -    -    -    -    -    3
CS423   Compiler Writing                                               -    3    -    -    -    3    3    -

Table 1.13: Interpretation of a numeric score sij (from course i for program outcome j)

Score   Interpretation
1       Unsatisfactory
2       Marginal
3       Satisfactory
4       Excellent

Regarding the computation of a score sij (from a course i for a program outcome j), we considered the option of directly associating specific gradable items in the course with that program outcome and inferring sij from the student scores on those items. However, we wanted the coverage to be as comprehensive as possible. Hence, we decided to start with the set of learning outcomes for course i and pick a subset to associate with program outcome j. This is more robust, as it provides an easier reality check for people other than the instructor of the course: perusing a subset of course learning outcomes to judge whether a program outcome is indeed covered is easier than reading various project specifications and examination questions, because the latter typically requires in-depth knowledge of the subject matter and its subtleties.


Regarding the combination of scores in the various gradable items, we chose a linear weighted sum for similar reasons. The only difference here is that we quantize to a 1..4 scale. The procedure was to identify the largest disjoint set of course learning outcomes corresponding to the program outcome at hand. For example, Figure 1.2 shows that a program outcome P1 is measured by courses C1, C2, and C3, with impact factors of 3, 3, and 2 respectively; we would identify the subset of course learning outcomes
L = {l11, l12, l15}
as relevant to course C1 and program outcome P1. For that set L,


  1. The instructor decides on a performance metric to interpret an average score for a course outcome as unsatisfactory, marginal, satisfactory, and excellent, the basis for a four-point scale; this takes care of variations among courses in grading, e.g., relative versus absolute, partial credit versus all-or-none grading.

  2. Each course outcome l in L is tied to gradable items in the course, e.g., a project, specific questions in the final exam. In Figure 1.2, course outcome l11 would be tied to questions q11 and q12.

  3. Weights are assigned to these questions or items (in Figure 1.2, 0.7 and 0.3 for questions q11 and q12 respectively); using them, a formula is written to compute a normalized weighted sum from the scores for those questions or items;

  4. From a table of scores obtained by the students in those gradable items, one numeric score is computed for each student per course outcome l.

  5. Those numeric scores are then averaged over the whole class to get one numeric score pl for each course outcome l.

  6. Using the performance metric, a number ql is obtained by quantizing pl to a four-point scale.

  7. The above is repeated for each l in L.

  8. The scores ql (in the four-point scale) are averaged over all l in L.

The result is the numeric score for the program outcome.
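A minimal sketch of these steps in Python follows. Only the question weights of 0.7 and 0.3 come from Figure 1.2; the student scores, the 10-mark scale, and the quantization thresholds are hypothetical stand-ins for an instructor-chosen performance metric:

    # Sketch of steps 1-8 for one course outcome (l11) tied to questions q11 and q12.
    # Only the 0.7/0.3 weights come from Figure 1.2; all other data are hypothetical.

    def quantize(class_average, full_marks):
        """Step 6: map a class average onto the four-point scale of Table 1.13
        using an instructor-chosen performance metric (example thresholds)."""
        fraction = class_average / full_marks
        if fraction >= 0.85:
            return 4  # excellent
        if fraction >= 0.70:
            return 3  # satisfactory
        if fraction >= 0.55:
            return 2  # marginal
        return 1      # unsatisfactory

    # Steps 2-3: course outcome l11 is tied to gradable items q11 and q12.
    weights = {"q11": 0.7, "q12": 0.3}

    # Grade sheet: per-student scores on those items (out of 10 marks each).
    grade_sheet = {
        "student1": {"q11": 8, "q12": 9},
        "student2": {"q11": 6, "q12": 7},
        "student3": {"q11": 9, "q12": 5},
    }

    # Step 4: one normalized weighted score per student for course outcome l11.
    per_student = [
        sum(w * scores[item] for item, w in weights.items())
        for scores in grade_sheet.values()
    ]

    # Step 5: average over the whole class.
    p_l11 = sum(per_student) / len(per_student)

    # Step 6: quantize to the four-point scale.
    q_l11 = quantize(p_l11, full_marks=10)

    # Steps 7-8: repeat for every course outcome in L (only l11 is shown here)
    # and average the quantized scores to obtain the course's score s_ij.
    q_values = [q_l11]
    s_ij = sum(q_values) / len(q_values)
    print(s_ij)

The resulting s_ij values then feed into the normalized weighted sum described earlier to produce the program outcome scores.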


Faculty members have used spreadsheets for such computation. A computer program has also been written to calculate results. The process has the advantage of low overhead: with a little advance planning, the calculations can be done from the grade sheet that the instructor already possesses at the end of the semester with minimal additional time commitment.
The course assessment report documents the weights, formulas, performance metrics, and the process, along with any conclusions drawn.
The implementation of this process has been gradual. The main deviation from the process has been related to disjointness. In a few courses in Spring 2006, the same graded item has been used for more than one learning outcome; in Fall 2006, some courses have used the same course learning outcome (and hence the same graded items) for more than one program outcome. These are documented in the course assessment report that accompanies the course display binder. We expect to have a complete implementation of disjointness in Fall 2007. In two courses (CS111 and CS382), the program outcome has been computed directly from student scores in specific assignments.
Figure 1.2: Computing Program Outcome scores from graded items in courses


Note: On the following page is a table that can be filled out with pertinent information relating to objectives, their measurement, and their effect on the implementation of program improvement.




B. Implementation of Objectives

Please complete the following table with as many objectives as needed.


Table 1.14: Improvements based on recent assessment

Program Objective 1: ability to design, implement, and analyze computational systems.
  How measured: Alumni Survey Q1 (indirect) and Q2 (direct).
  When measured: Spring 2007.
  Improvements identified: None needed.
  Improvements implemented: None.

Program Objective 2: capability to tackle complex computer science related problems in the real world.
  How measured: Alumni Survey Q1 (indirect) and Q2 (direct).
  When measured: Spring 2007.
  Improvements identified: None needed.
  Improvements implemented: None.

Program Objective 3: contribution towards the advancement of computing science and technology.
  How measured: Alumni Survey Q1 (indirect) and Q2 (direct).
  When measured: Spring 2007.
  Improvements identified: The scores for Q1 and Q2 are both low. From an analysis of the written responses, we feel that this survey question was misunderstood; it needs to be augmented with some examples to remove the possible misconception that contributions did not qualify unless they were impressively lofty. (Section I.C, Page 30)
  Improvements implemented: The survey is being redesigned. If the score persists, we will examine further.

Program Objective 4: capacity to work effectively with peers in computational tasks.
  How measured: Alumni Survey Q1 (indirect) and Q2 (direct).
  When measured: Spring 2007.
  Improvements identified: None needed.
  Improvements implemented: None.

Program Objective 5: cognizance of ethical, social, and legal issues pertaining to computer science.
  How measured: Alumni Survey Q1 (indirect) and Q2 (direct).
  When measured: Spring 2007.
  Improvements identified: For Q1 (indirect), none needed. For Q2 (direct), the score is not satisfactory; as in Objective 3, analysis of the written responses suggests that the wording of the question in the survey needs to be improved. (Section I.C, Page 31)
  Improvements implemented: The survey is being redesigned. If the score persists, we will examine further.

Program Outcome 1a: the ability to design, implement, and test small software programs.
  How measured: Combining assessment scores from relevant courses.
  When measured: Spring 2006 through Spring 2007.
  Improvements identified: There are no concerns at the program level. Individual courses are making adjustments based on feedback.
  Improvements implemented: None.

Program Outcome 1b: the ability to design, implement, and test large programming projects.
  How measured: Combining assessment scores from relevant courses.
  When measured: Spring 2006 through Spring 2007.
  Improvements identified: There are no concerns at the program level. Individual courses are making adjustments based on feedback.
  Improvements implemented: None.

Program Outcome 2: knowledge of the theoretical concepts of computing.
  How measured: Combining assessment scores from relevant courses (Spring 2006 through Spring 2007); text of course assessment reports (Fall 2006).
  Improvements identified: Both contributing courses report a common problem: weakness in Discrete Mathematics. CS122 students were reported deficient in 'proof by induction', a course learning outcome.
  Improvements implemented: The faculty met with the Chair of Mathematics in early 2007. Consequently, the Math department has started work on re-vamping this course to ensure an adequate minimum knowledge for a passing grade. In addition, our faculty will decide in Fall 2007 if there should be a change in the minimum grade required when this course is used to satisfy a prerequisite. Although proof by induction is later covered in Discrete Mathematics, the faculty has decided to experiment with a guest lecture on this topic in the Fall.

Program Outcome 3: knowledge of the fundamental principles of programming languages, systems, and machine architectures.
  How measured: Combining assessment scores from relevant courses.
  When measured: Spring 2006 through Spring 2007.
  Improvements identified: There are no concerns at the program level. Individual courses are making adjustments based on feedback.
  Improvements implemented: None.

Program Outcome 4: exposure to one or more computer science application areas.
  How measured: Combining assessment scores from relevant courses.
  When measured: Spring 2006 through Spring 2007.
  Improvements identified: There are no concerns at the program level. Individual courses are making adjustments based on feedback.
  Improvements implemented: None.

Program Outcome 5: technical communication skills in written and oral form.
  How measured: Combining assessment scores from relevant courses (Spring 2006 through Spring 2007); questionnaire to students in CS3xx and CS4xx courses (Fall 2006).
  Improvements identified: There are no concerns at the program level. Individual courses are making adjustments based on feedback and also ensuring that this program outcome can be graded by itself (see commentary with Table 1.27, Page 36). Students expressed a need for greater opportunities for presentations and communication.
  Improvements implemented: This was addressed through the immediate addition of a CS489 Senior Seminar course in Spring 2007.

Program Outcome 6: the capacity to work as part of a team.
  How measured: Combining assessment scores from relevant courses.
  When measured: Spring 2006 through Spring 2007.
  Improvements identified: There are no concerns at the program level. Individual courses are making adjustments based on feedback and also ensuring that this program outcome can be graded by itself (see commentary accompanying Table 1.28, Page 36).
  Improvements implemented: None.

Program Outcome 7: cognizance of ethical, social, and legal issues pertaining to computer science.
  How measured: Combining assessment scores from relevant courses.
  When measured: Spring 2006 through Spring 2007.
  Improvements identified: The score is unsatisfactory (Table 1.29). First, the assessment method for CS382 Legal, Ethical, and Social Issues in Information Technology will be improved. Second, students will be advised to follow the sample curriculum and take CS326 Software Engineering with this course or prior to enrolling in it. Third, faculty members will work on discussing ethical issues that intersect with topics in the Computer Science courses they teach.
  Improvements implemented: The changes will be implemented from Fall 2007. The instructor of CS382 will be asked to compare the performance of students who have taken CS326 against those who have not.

While the above table (Table 1.14) is based on more recent precise measures, the table below (Table 1.15) lists changes that were based on a mix of methods and less than precise measures that we used prior to 2006: insight from course evaluations, direct communication from students, feedback via surveys within a course, instructor’s observations, and student scores in examinations or projects.


Table 1.15: Improvements based on assessment prior to 2006

Program Outcome 1b: the ability to design, implement, and test large programming projects (although the objective was not articulated at the time).
  How measured: CS423 Compiler Writing course evaluations, as well as instructor observation of success in compiler projects.
  When measured: Fall 2003.
  Improvements identified: The main problem was weakness in software engineering skills.
  Improvements implemented: This was initially addressed in 2004 through extra class time on these topics, but project deadlines became more stressful. Later, in Fall 2005, the assignment sequence was re-designed; this led to more favorable course evaluations, but many students still were not able to substantially accomplish the project. In 2006, this was addressed at the program level through the addition of a new course, CS326 Software Engineering, first taught in Spring 2007. We have assessed this new course using precise measures and found that it did not yet address this program outcome satisfactorily (Table 1.23). The instructor has traced the problem to the students' inability to grasp certain concepts like design patterns and object design. These concepts will be given increased emphasis next year. While we do not expect to see the full impact of this course before 2008, we view this as a success of our assessment process.

Program Outcome 2: knowledge of the theoretical concepts of computing.
  How measured: CS344 Design and Analysis of Algorithms: a direct measure from an examination question on NP-completeness and an indirect measure from a post-course student self-evaluation.
  When measured: Fall 2004.
  Improvements identified: Direct measures confirmed that the topic in question was not grasped by at least 50% of the students. Self-evaluations matched direct measures for those in the middle but were skewed at the two extremes.
  Improvements implemented: The instructor decided to explore this further by spending more time on this topic and engaging more students in discussion in 2005. But such discussion led to the suspicion that the students' preparation in discrete mathematics was the more fundamental stumbling block. Consequently, an announced quiz on the fundamentals of discrete mathematics was given to all students in the second week of class in Fall 2006. The results confirmed the suspicion. Subsequent assessment and actions are described in Table 1.14 above.


Standard I-3. Data relative to the objectives must be routinely collected and documented, and used in program assessments.

Standard I-4. The extent to which each program objective is being met must be periodically assessed.

Standard I-5. The results of the program’s periodic assessment must be used to help identify opportunities for program improvement.




C. Assessments

For each instrument used to assess the extent to which each of the objectives is being met by your program, provide the following information:




  1. Frequency and timing of assessments

  2. What data are collected (should include information on initial student placement and subsequent professional development)

  3. How data are collected

  4. From whom data are collected (should include students and computing professionals)

  5. How assessment results are used and by whom

Attach copies of the actual documentation that was generated by your data collection and assessment process since the last accreditation visit, or for the past three years if this is the first visit. Include survey instruments, data summaries, analysis results, etc.



Earlier, in Table 1.8, we indicated the frequency and timing of our periodically scheduled survey instruments. Below we list, for each assessment instrument, our answers to the five questions above regarding frequency/timing and use of assessment, plus what, how, and from whom data are collected. Overall, we have collected data from current and graduating majors, alumni, and faculty, as well as from the Career Placement Office. The assessment results are used by the faculty and the Industrial Advisory Board. We receive and consider data from the alumni as well as feedback from the Industrial Advisory Board, the large majority of whom are computing professionals.


Assessment instruments:
Career Placement Data


Frequency and timing of assessment

Each semester starting in Spring 2007.

What data are collected

Placement status.

How data are collected

Graduating students fill out the survey (see below).

Also, emails are sent to those who have graduated.



From whom data are collected

Students graduating from the B.S. in Computer Science program and those who have graduated.

Data from the Career Placement Office.




Who uses assessment results and how

Faculty discusses the data and presents them to the Industrial Advisory Board.

The preamble to the Computer Science entry in the catalog states that “graduates of the Computer Science bachelor’s program will be well prepared for both industry employment and graduate study”. In Spring 2007, we began collecting placement data from graduating students.


Thirteen students graduated in May 2007. Six are pursuing a master's degree in Computer Science: four at NMT, one at Columbia University, and one at the University of California at San Diego. Three are employed: at DoD, SPAWAR, and Pathfinder Energy Services. Of the four others, one is interning, two are seeking employment, and one is deceased. Within a month of graduation, only two of the thirteen had neither found employment nor been accepted to graduate school. We interpret this as a high assessment of the quality of the program.

Student Questionnaire (Attachment 1):


Frequency and timing of assessment

Each Fall semester commencing Fall 2006.

What data are collected

Level of agreement with proposed program objectives and outcomes, open-ended comments about the objectives and outcomes, and reasons for the ratings.

How data are collected

The questionnaire was handed out in class; students could return their answers to the instructor within two days. Each instructor scanned and summarized the results; later, the results were entered into a spreadsheet.

From whom data are collected

Undergraduate CS majors enrolled in CS3XX and CS4XX classes. 49 students responded.

Who uses assessment results and how

Faculty examined the levels of agreement and comments about the wording of the program objectives and outcomes. This led to a minor change in the wording before it was sent for entry in the new catalog.

Faculty also discussed the comments and made decisions.



The Industrial Advisory Board reviewed these data and faculty decisions.

Figure 1.3: Student responses regarding agreement with the stated Program Objectives

Number of responses = 49



Figure 1.4: Student responses regarding agreement with the stated Program Outcomes

Number of responses = 49


The results of the Fall 2006 questionnaire indicate that students enrolled in junior and senior Computer Science courses overwhelmingly approved the Program Educational Objectives and Outcomes prepared by the faculty (See Figures 1.3 and 1.4).
Since this questionnaire, there has been one change: we have added the word legal to the text of the last program outcome; though this modification has not yet been ratified by the student body as a whole, it has been approved by the Industrial Advisory Board and specifically by the undergraduate student member of that Board.

Faculty discussed the comments made by students and took one major action: the addition of an elective Senior Seminar course as explained earlier (Page 17).



Senior Exit Survey (Attachment 2):


Frequency and timing of assessment

End of each semester since Fall 2006.

What data are collected

Answers to questions in a survey: indirect measure of achievement of program outcomes; feedback about the effectiveness of the advising process and satisfaction with the entire program; other questions, e.g., about placement; feedback regarding curriculum also being gathered for future analysis.

How data are collected

Graduating students fill out the survey during the examination week and hand it to the department secretary, who enters the data into a spreadsheet.

From whom data are collected

Students graduating from the B.S. in Computer Science program.

There have been 14 responses so far (5 from Fall 2006 and 9 from Spring 2007).



Who uses assessment results and how

The faculty uses the spreadsheet to collect average and standard deviation information about achievement of program outcomes. The faculty examines and discusses the comments and answers to other questions. Faculty presents its findings to the IAB.

Performance Metric:




  • an average score of 4.0 or greater will be taken as satisfactory for all questions on a 5-point scale.

Fourteen students filled out an exit survey: five in the examination week of the Fall 2006 semester and nine in the corresponding week of the Spring 2007 semester. The results follow. While the sample size is small, the data were useful: first, we examined the comments instead of relying blindly on the numbers; second, even with this small data set, we saw that certain data matched data from elsewhere, and this led us to investigate.


Self-evaluation of achievement of the program outcomes:

For the following statements about the achievement of program outcomes, students recorded their level of agreement using a 5-point scale:

1 (strongly disagree), 2 (disagree), 3 (neither agree nor disagree), 4 (agree), and 5 (strongly agree); Table 1.16 lists the average, standard deviation, and number of those responses.
Based on our performance metric, the results are satisfactory. Nevertheless, we have noted that the two lowest scores (Outcomes 1b and 5) are accompanied by the largest standard deviations. Clearly, there was more disagreement about the achievement of these two program outcomes. The addition of CS326 Software Engineering and CS489 Senior Seminar should help in addressing similar concerns in the future.

Table 1.16: Senior Exit Survey results about Program Outcomes

Statement related to Program Outcomes                                                   AVG    STD DEV  NUM
1a. I am able to design, implement, and test small software programs.                   4.71   0.5      14
1b. I am able to design, implement, and test large programming projects.                4.00   0.8      14
2. I understand the theoretical concepts of computing.                                  4.36   0.6      14
3. I understand the fundamental principles of programming languages, systems,
   and machine architectures.                                                           4.43   0.5      14
4. I have gained exposure to one or more application areas within computer science.     4.71   0.5      14
5. I have technical communication skills in written and oral form.                      4.00   0.9      14
6. I have the capability to work as part of a team.                                     4.57   0.5      14
7. I understand the ethical and societal impact of developments in the field of
   computer science.                                                                    4.43   0.5      14

Students were asked how effective the process of academic advising by faculty was, using a five-point scale: 1 (completely ineffective), 2 (ineffective), 3 (neither effective nor ineffective), 4 (effective), and 5 (very effective). The responses are summarized in Table 1.17.


Table 1.17: Senior Exit Survey result about academic advising

                                                               AVG    STD DEV  NUM
Effectiveness of the process of academic advising by faculty  3.57   1.09     14

By our measures, this is unsatisfactory. As we see in the sequel, there are other indications that many students are not finding the academic advising to be effective. The faculty has decided to work with Ms. Franco, the student IAB member, on this problem; in Fall 2007, we expect to get substantial feedback from current students about their perceptions of, expectations from, and suggestions regarding the advising process.

Students were asked how satisfied they were with the academic experience of earning the B.S. in Computer Science degree. The responses gave us the results in Table 1.18.

Table 1.18: Senior Exit Survey result about the academic experience

                                           AVG    STD DEV  NUM
Satisfaction with the academic experience  3.86   0.95     14

This too is unsatisfactory. A part of this dissatisfaction can be linked to advising, but we analyzed further. For those respondents who gave a low rating (2 or 3) for this question, we pondered over their answers to the next question, which asked how the program could be improved. The unhappiness seems to be with some lower-division courses such as CS221 and CS222, the need for more programming, scheduling programming assignments early in the courses, and more technical electives. We are aware of past problems with temporary instructors we had hired for those courses. The new software engineering course should address the need for programming. The faculty will examine the feasibility of scheduling programming assignments earlier in the semester in advanced courses. While the number of electives we can offer is limited by the number of faculty, Table 2.1 (Section II: Frequency of Course Offerings Page 43) shows that we have offered a reasonable mix and number of electives over the last six and a half years. The main problem seems to be that students have often not articulated their desire for elective offerings and thus felt disappointed when certain specific electives were not offered in their last semesters. We have devised a pre-approval form (Attachment 6) to be used by students entering in Fall 2007 onwards, to list their desired electives. This will enable the faculty to better satisfy the desire for specific electives.



Alumni Survey (Attachment 3):


Frequency and timing of assessment

Each Spring semester; one set of surveys was sent out in Spring 2007.

What data are collected

Answers to questions in a survey: indirect measure of achievement of program objectives plus direct measure of the same; feedback about the effectiveness of the advising process and satisfaction with the entire program; other questions about post-graduation success, NMT’s facilities, curriculum, etc.

How data are collected

A list of students who graduated in a particular time window is obtained from the Registrar's Office and the Career Placement Office (the department has started creating its own database); a paper survey form is sent to each; completed surveys are returned to the department secretary, who enters the data into a spreadsheet.

From whom data are collected

Students who graduated with a B.S. in Computer Science roughly five years prior. Surveys were sent out to 37 alumni: one who had graduated in December 2001, twelve in May 2002, five in December 2002, eighteen in May 2003, and one in December 2003. (A communication problem with the Career Placement Office resulted in a wider temporal range than we wished.) We have received 7 responses.

Who uses assessment results and how

The faculty uses the spreadsheet to collect average and standard deviation information about achievement of program objectives.

The faculty examines and discusses the comments and answers to other questions. Faculty presents its findings to the Industrial Advisory Board.



Performance Metric:



  • An average score of 4.0 or greater will be taken as satisfactory for all questions on a 5-point scale.

  • At least 60% of respondents should report at least one occasion of attainment of the program objectives to be considered satisfactory.

So far, we have received seven surveys from our targeted set of 2002 and 2003 alumni (four had graduated in 2002 and three in 2003). One respondent had earned an M.S. degree, while another was in the process of earning a graduate degree.

Here is what we have learned about program objectives.
For an indirect measure, we posed the question, ‘Do you believe that at this point in time, if asked, you could demonstrate that you have …’ achieved each of the program objectives; the answers use a 5-point scale: 1 (NO, I strongly disagree), 2 (no, I disagree), 3 (yes-and-no, I neither agree nor disagree), 4 (yes, I agree), and 5 (YES, I strongly agree). The results are tabulated in Table 1.19.

Table 1.19: Alumni Survey result about program objectives (indirect measure)

Program Objective                                                                          AVG    STD DEV  NUM
1. the ability to design, implement, and analyze computational systems?                   4.57   0.79     7
2. the capability to tackle complex computer science related problems in the real world?  4.43   0.53     7
3. contributed towards the advancement of computing science and technology?               3.43   1.13     7
4. the capacity to work effectively with peers in computational tasks?                    4.86   0.38     7
5. cognizance of ethical, social, and legal issues pertaining to computer science?        4.57   0.53     7

Similarly, for a direct measure, we posed the question, ‘Consider the time interval starting from your graduation with a B.S. in Computer Science from NMT and ending today. During this interval, was there at least one occasion in which you …’ have achieved each of the objectives, and to describe the occasion, if any. The result is given in Table 1.20.


Table 1.20: Alumni Survey result about program objectives (direct measure)

Program Objective                                                                          YES %
1. the ability to design, implement, and analyze computational systems?                   100
2. the capability to tackle complex computer science related problems in the real world?  100
3. contributed towards the advancement of computing science and technology?               43
4. the capacity to work effectively with peers in computational tasks?                    100
5. cognizance of ethical, social, and legal issues pertaining to computer science?        57

Clearly, the scores for the indirect and direct measures for Objective 3 were unsatisfactory as was the direct measure for Objective 5. We explored the responses further in search of clues.


We analyzed other answers from those respondents who felt they had not contributed towards the advancement of computing science and technology. One was building graphical user interfaces for a very large software company; another was a system administrator; a third had designed large software systems with a high degree of modularity and flexibility, using components that had to be re-usable in future systems; and a fourth had been involved in creating a mission planning system for an unmanned aerial vehicle, software that allows pilots to control an airplane via a computer. We felt that the system administrator had probably looked narrowly at his/her job and found only maintenance and upgrades, instead of the broader impact of well-maintained computing facilities on the performance of the organization. Looking at the projects of the rest, we felt that they had perceived the bar for contribution as too high. Thus, we came to the conclusion that the question needs to be re-worded and augmented with examples to communicate to our alumni that a 'contribution' does not have to be either award-winning or earth-shattering in order to qualify.
Regarding the question about cognizance of ethical, social, and legal issues pertaining to computer science, two respondents answered in the negative and one chose not to answer. All three, however, had strongly agreed with the statement that they could demonstrate that they had the necessary cognizance. Of the first two, one was engaged in building user interfaces for a very large software company and the other was the alumnus who had worked on a mission planning system; the respondent who had left the answer blank was working on classified systems. We find it unlikely that work on user interfaces, mission planning, or classified systems is divorced from ethical, social, or legal issues. Here too, we feel that it would be helpful if examples of what we meant accompanied the survey question. One alumnus did comment that he/she did not find the coverage of the ethics course usable (though it could be usable by others); we feel that this validates our decision to emphasize the universality and omnipresence of ethical, social, and legal issues in as many Computer Science courses as possible.
Alumni were also asked to assess the quality of the following facilities using a 5-point scale: 1 (very bad), 2 (bad), 3 (neither good nor bad), 4 (good), 5 (very good). The results are tabulated in Table 1.21.


Table 1.21: Alumni Survey result about facilities

  Facilities                        AVG    STD DEV   NUM
  Classrooms                        3.86   0.38      7
  Advising/mentoring resources      2.43   0.79      7
  Computer hardware and software    3.57   0.98      7
  NMT Library                       4.0    0.82      7

We are not worried about the evaluation of the classrooms and the hardware and software facilities, since we have now moved to a recently renovated building equipped with superior classrooms and new computing hardware and software. However, as mentioned earlier, we take the score for advising/mentoring seriously and have decided to investigate the adequacy of the corresponding processes and resources.


Career Placement:

The alumni surveys received so far indicate that all our respondents were either employed in industry or enrolled in graduate school. We interpret a high rate of employment and graduate-school enrollment as an indicator of the quality of the program.


Appropriateness of the list of program objectives:

There was no disagreement with the list of program objectives.


Feedback about the overall program:

There was general appreciation of the program, with the reservation that teaching quality was not uniform; a suggestion that faculty should exchange their personal teaching philosophies; and a couple of warnings that the curriculum should not only look good on paper but also be implemented consistently. One respondent commented on the need for hands-on lab work, another asked for more research opportunities for undergraduates, and one applauded the software construction course project.


In our view, the comments about teaching reflect a period around 2001 marked by some less-than-satisfactory teaching by faculty members who have since left. The remark about implementation of the curriculum probably reflects some unhappiness about a specific elective, which we are addressing with a pre-approval form (Attachment 6), as discussed earlier. The need for hands-on programming projects is being met with the new Software Engineering course. Unlike in the past, there are now many opportunities for undergraduate research, beginning with a senior seminar course that focuses on research and extending to participation in research groups.




Program Outcome computation from courses:


Frequency and timing of assessment

Beginning of each semester.

What data are collected

Instructors of each undergraduate Computer Science course turn in a course assessment report at the end of each semester.

How data are collected

Numeric scores are computed for program outcomes from scores in relevant courses as per a matrix (Table 1.12); each such course instructor selects course learning outcomes appropriate for that program outcome; student scores in relevant graded items are then used to obtain a numeric measure.

From whom data are collected

Student scores on specific examination questions, projects, etc. are used by the instructor.

Who uses assessment results and how

The faculty reviews the numeric scores reported for each mandatory course and the computed scores for each program outcome.

The faculty also discusses the main findings from each course (mandatory and optional).

The numeric scores, the major findings from the text of the assessment reports, and the consequent decisions are presented to the Industrial Advisory Board.



Performance Metric:

  • An average score of 3.0 or greater will be taken as satisfactory for all numeric scores computed from courses. (Note that these numeric scores are on a 1-to-4 scale, unlike the five-point scale used for survey responses.)

Using the weighted-sum method outlined earlier (see Section I.A, Computing numeric scores for program outcomes; Page 13), we have obtained numeric scores for program outcomes from the contributing courses (as per Table 1.12) taught in Spring 2007, Fall 2006, and Spring 2006. Below, we present our program outcome scores for the academic year 2006-2007. We will compile similar figures for each academic year in order to analyze changes from previous years and discern trends. We observed only one change in the contributing scores between Spring 2006 and Spring 2007; we remark on it in our commentary below.
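
To make the arithmetic easy to audit, the following Python sketch gives a minimal reading of the weighted-sum method described in Section I.A: a program outcome score as the weight-normalized sum of the contributing course scores. The function name and data layout are illustrative rather than part of our assessment tooling; the weights come from Table 1.12.

    # Program outcome score as a weight-normalized sum of course scores.
    # Weights come from the course/outcome matrix (Table 1.12).
    def outcome_score(contributions):
        """contributions: list of (course, score, weight) tuples."""
        total_weight = sum(weight for _, _, weight in contributions)
        weighted_sum = sum(score * weight for _, score, weight in contributions)
        return weighted_sum / total_weight

    # Example: Program Outcome 1b (Table 1.23 below):
    # (4*3 + 2.3*3 + 3*3) / (3 + 3 + 3) = 3.1
    print(round(outcome_score([("CS325", 4, 3), ("CS326", 2.3, 3), ("CS423", 3, 3)]), 1))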


Program Outcome 1a: the ability to design, implement, and test small software programs:
Table 1.22: Numeric assessment for Program Outcome 1a

  Course   Score   Weight   Overall Score
  CS111    4       3        3.5
  CS122    3       3
  CS324    3       3
  CS331    4       3

There are no concerns in this area.


Program Outcome 1b: the ability to design, implement, and test large programming projects:
Table 1.23: Numeric assessment for Program Outcome 1b

  Course   Score   Weight   Overall Score
  CS325    4       3        3.1
  CS326    2.3     3
  CS423    3       3

Although the overall score of 3.1 is satisfactory, we have noticed a less than satisfactory score for CS326, the new software engineering course. This problem has been traced to an inadequate understanding of two course topics: software design patterns and object design. The treatment of these topics will be enhanced in the next offering of the course.


Program Outcome 2: knowledge of the theoretical concepts of computing:
Table 1.24: Numeric assessment for Program Outcome 2

  Course   Score   Weight   Overall Score
  CS342    2.35    3        2.03
  CS344    1.70    3

We have an unsatisfactory score for this program outcome. Instructors of both courses have identified a common problem: inadequate student preparation in discrete mathematics. This was confirmed through an announced quiz on the fundamental concepts of discrete mathematics given to entering students of CS344 Design and Analysis of Algorithms in Fall 2006; the scores clearly demonstrated that students were not prepared. This problem is being addressed with the cooperation of the Mathematics department. The outgoing Math department Chair has met with the Computer Science faculty and briefed the incoming Math Chair; they are discussing possible measures with the Math faculty for implementation starting in Fall 2007 (Spring 2008 if a change in the catalog is required). The alternatives include adjusting the syllabus of the discrete mathematics course to place greater emphasis on the fundamentals needed for these Computer Science courses, and adding a must-pass segment to the final examination covering those essential concepts. In addition, the Computer Science faculty will consider requiring a minimum grade when the discrete mathematics course is used to satisfy a prerequisite for CS342 or CS344.


The low score for CS342 Formal Languages and Automata is anomalous: its score in Spring 2006 was 3. The reason for the low score in Spring 2007 is that one topic (Turing machines and recursively enumerable languages) was not covered owing to a shortage of time. This will be addressed by reordering the lecture schedule when the course is offered in Spring 2008.

It is worth noting that the low score for CS342 in Spring 2007 shows that our scheme is working as desired: thanks to the comprehensive nature of our quantitative assessment method, an anomaly affecting one important topic in one course registered an effect on the overall score for the course. As explained earlier (in Section I.A, Computing numeric scores for program outcomes; Page 13), the numeric score for a program outcome is a weighted sum of contributions from relevant course learning outcomes, which, in turn, are computed from gradable items. Since one course learning outcome (Turing machines) was not supported by any graded item, it contributed a zero to the program outcome, reducing its score. Had we restricted our procedure to scores on questions actually asked and graded (necessarily on topics actually covered in the course) instead of what was promised in the syllabus, we could never have revealed such a lack of coverage.
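
The following minimal sketch (with hypothetical topic scores and weights, not our actual figures) makes the zero-contribution effect concrete: an uncovered learning outcome drags down the course's contribution, which in turn lowers the program outcome score.

    # A course's contribution is itself a weighted sum over its course
    # learning outcomes; an outcome with no supporting graded item scores 0.
    def course_score(learning_outcomes):
        """learning_outcomes: list of (score, weight) pairs; score is None
        when the topic had no graded item that semester."""
        total_weight = sum(w for _, w in learning_outcomes)
        total = sum((s if s is not None else 0.0) * w for s, w in learning_outcomes)
        return total / total_weight

    all_topics_graded = [(3.0, 1), (3.0, 1), (3.0, 1)]   # course score 3.0
    one_topic_skipped = [(3.0, 1), (3.0, 1), (None, 1)]  # e.g., Turing machines: 2.0
    print(course_score(all_topics_graded), course_score(one_topic_skipped))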


Program Outcome 3: knowledge of the fundamental principles of programming languages, systems, and machine architectures:
Table 1.25: Numeric assessment for Program Outcome 3

  Course   Score   Weight   Overall Score
  CS221    3       3        3.2
  CS222    3       3
  CS324    3       3
  CS325    3.67    3
  CS331    3.5     3
  CS353    3       3

We have no concerns in this area.


Program Outcome 4: exposure to one or more application areas within computer science:
Table 1.26: Numeric assessment for Program Outcome 4

  Course   Score   Weight   Overall Score
  CS324    3       2        3
  CS353    3       2

We have no concerns in this area.




Program Outcome 5: technical communication skills in written and oral form:
Table 1.27: Numeric assessment for Program Outcome 5

  Course   Score   Weight   Overall Score
  CS326    3       3        3.33
  CS331    4       2
  CS423    3       3

We have no concerns in this area.


However, in CS423 Compiler Writing, the compiler project was used to assess both Program Outcomes 1b and 5, i.e., the graded items used for the two Program Outcomes were not disjoint. We have discussed this and concluded that though the implementation of large programs may depend on technical communication, the latter does not necessarily depend on the former. Commencing in Fall 2007, points will therefore be set aside specifically for the technical communication aspect, so that we can assess Program Outcome 5 without it being affected by any other Program Outcome.
The implementation of a large project will still depend on success in technical communication and teamwork; this is a consequence of what large program implementation entails and, more generally, of the very nature of computing. We will take this into account while interpreting scores for Program Outcome 1b: if we get an unsatisfactory score, we will examine these related component outcomes as well.

Program Outcome 6: the capacity to work as part of a team:
Table 1.28: Numeric assessment for Program Outcome 6

  Course   Score        Weight   Overall Score
  CS325    NOT GRADED   3        3.5
  CS326    4            3
  CS423    3            3

We have no concerns in this area. The instructor of CS325 Operating Systems did not allocate a separate grade for teamwork; this will be addressed in future offerings of the course. The comments regarding CS423 Compiler Writing under the previous item (Program Outcome 5) apply here as well.


Program Outcome 7: awareness of the ethical and societal impact of developments in the field of computer science:
Table 1.29: Numeric assessment for Program Outcome 7

  Course   Score   Weight   Overall Score
  CS221    3       1        2.25
  CS382    2       3

The score is unsatisfactory.


CS382 Legal, Ethical, and Social Implications of Information Technology was assessed in a slightly different manner: the average scores of students on certain assignments were averaged directly, bypassing the course learning outcomes, because the assignments were not separated by those outcomes. It was noted that though the overall average was low, the average restricted to the students who actually turned in the assignments was much higher. The situation was unchanged in Fall 2006 and Spring 2007.
The instructor of this course had made two program-level recommendations, which have been discussed by the faculty. The first recommendation was to make CS326 Software Engineering a pre- or co-requisite for this course because once students are exposed to large code projects, they can be expected to acquire an enriched appreciation for the problem of software reliability and associated legal and ethical issues. The department has decided to strongly advise its students either to follow the sample curriculum and take these two courses together or to take CS326 before CS382. From Fall 2007 onwards, the instructor of CS382 will be asked to compare the performance of the students who have taken (or are taking) CS326 against those who have not. Based on this evaluation, the department will revisit the possibility of a co-requisite. The second recommendation was that other department courses should include a component (whether graded or not) addressing the value of ethics. The faculty members have agreed to include discussion on ethical issues that intersect with the material of their courses.

Standard I-6. The results of the program’s assessments and the actions taken based on the results must be documented.





  C. Program Improvement

Describe your use of the results of the program’s assessments to identify program improvements and modifications to objectives.



Include:


  1. Any major program changes within the last five years

  2. Any significant future program improvement plans based upon recent assessments




Assessment-based changes:


  • In Fall 2006, the questionnaire completed by undergraduates taking junior and senior Computer Science courses clearly indicated that the students felt they needed more opportunities for developing technical communication skills (Page 17). Consequently, a CS489 Senior Seminar course was offered in Spring 2007. The effectiveness of this course will be judged in two ways. First, from Fall 2007, our Senior Exit Survey will include a question asking for a self-evaluation of how effective this course has been in imparting the necessary skills. Second, since all of the Spring 2007 Senior Seminar students had already taken CS423, this comparison cannot begin immediately: in future offerings, the CS423 Compiler Writing instructor will closely observe and compare the performance of students who have taken the seminar against those who have not.




  • In Fall 2003, assessment of CS423 Compiler Writing revealed that many students failed to complete the final project and complained that the amount of work involved was unreasonable. The instructor found that students were developing programs inefficiently, doing a great deal of unnecessary (and sometimes repetitive) work, and concluded that the lack of software engineering skills was the major cause. After many in-course corrections were attempted with varying degrees of success (Page 19), the faculty agreed in 2005 that the problem needed a program-level solution: the addition of a required course on software engineering that would have synergistic benefits for other courses. Consequently, CS326 Software Engineering was added as both a required course and a prerequisite for CS423. It was offered for the first time in Spring 2007.




  • In 2005, during faculty discussion of the curriculum, it was pointed out that since CS453 Computer Networks had always been an elective and no network-related course was mandatory, some students had no exposure at all to the basic concepts of data communication that form the foundation of computer networking. Furthermore, since the department wished to initiate a Computer Engineering program in the future, a course covering the basic concepts of data communication would be beneficial. Consequently, CS353 Data Communications was added as a required course in 2005-06 so that all students would gain an adequate knowledge of computer communication, and thus acquire a foundation for computer networking. In Spring 2006, the faculty observed from assessment of elective courses that students needed the fundamentals of networking, and concluded that the mandatory CS353, already taught in Fall 2005, needed to be enhanced with an introduction to networks that hitherto had been covered only in CS453. CS353 was accordingly enhanced when taught in Fall 2006. In Spring 2007, when the faculty discussed the results of assessment and the state of the program, it noted an unnecessary overlap between CS353 and CS453: both covered an introduction to networks. It was decided that the next time CS453 is taught, it will be devoted exclusively to more advanced topics in networks.




  • In 2005, assessment of CS122 indicated that many entering students had inadequate experience with tools and programming environments (editors, compilers, and the Linux operating system). After discussion among the instructors and coordinator of CS111 and CS122, and later with other faculty, CS111 was strengthened to make sure that students who started with little or no background in programming and programming environments got the opportunity to learn the basic tools and techniques needed. The results from our more recent formal assessment process (Table 1.22, Page 33) indicate that this appears to have been successful.


Non-assessment-based changes:


  • Since 2002, a number of courses on security have been added: Secure Software Construction (CS328), Information Assurance (CS464), and Digital Forensics (CS489 Special Topics). While these were introduced prior to the formal assessment process, they benefit the undergraduate program by providing knowledge and skills in computer security and making the students more employable.




  • In Fall 2006, the following changes were made to the curriculum:




  • the addition of a breadth requirement: three hours of electives from Education, Fine Arts, Humanities, Management, Philosophy, Social Science, or Technical Communication; the aim was to impart a broader background to our graduates;




  • the addition of a statistics requirement: with the cooperation of the Mathematics department, the required probability course MATH382 has been enhanced with a 1-hour statistics component MATH382L; our motivation was to satisfy several government and industry employers of our students who demanded exposure to both probability and statistics.




  • a change in an institute-wide requirement so that computing laboratories now count towards the eight hours of laboratory work that students of all majors must take under the General Degree Requirements; this gave our students more flexibility and accommodated the above two modifications without infringing unduly upon the set of free electives. Note that this laboratory work is in addition to two semesters each of physics and chemistry.




  • Changes in the elective course structure:




  • In 2003, the faculty decided to group the technical electives under suggested Elective Tracks in order to encourage students to gather their electives under a descriptive umbrella and to underscore the need for focus when choosing electives. In Fall 2006, it was realized that although a range of elective courses was offered each year, the offerings did not necessarily provide a consistent sequence for the suggested tracks. Consequently, the elective structure was revised. Commencing with the 2007-09 catalog, the suggested tracks have been removed. Students must still choose 12 hours of CS3xx or higher courses as technical electives, and these must still be pre-approved by the student's advisor and the Associate Chair for Undergraduate Affairs (previously titled Undergraduate Advisor). The only additional requirement is that a form listing this sequence be filled out; the intent is that a student and his/her advisor will meet and put together a set of technical electives that prepares the student for a desired career path, and that faculty will receive notice about technical elective courses that need to be scheduled.



Future improvements:


  • As a result of course assessment at the end of 2006, it was realized that there was an urgent need to improve the Discrete Mathematics course. Since this course is taught by the Mathematics department, the current Mathematics Department Chair was invited to attend a meeting with the department faculty in Spring 2007. The incoming Mathematics Department Chair has also been informed of the faculty's concerns and recommendations, and he is committed to working with the CS department to improve the course. As mentioned earlier, one course of action being considered is an adjustment of the syllabus to place greater emphasis on the fundamentals needed for the Computer Science courses; another is the inclusion in the final examination of a set of questions covering just those fundamentals, which students must pass in order to pass the course. A parallel track under consideration by the Computer Science faculty is to require that students not only pass the Discrete Mathematics course (i.e., get a grade of D or better) but obtain a certain minimum grade in order to satisfy the prerequisite for CS342 and CS344.






  D. Program Evolution

1. Describe in what respect, if at all, the philosophy and direction of the computer science program has changed at your institution during the last five years, or since the last accreditation visit, whichever is the more recent.




  • In December 2005, it was decided to seek ABET accreditation and routine assessment of the B.S. program was initiated.




  • In general philosophy and direction, the most significant change is that the CS faculty has become more attentive and receptive to the CS curricula of other institutions, especially Ph.D.-granting and peer institutions. This has resulted in more frequent curriculum discussions and revisions. As a consequence, the number of new courses developed in the last five years is significantly higher than during the preceding ten years.



2. Describe any major developments and/or progress made in connection with the program in the last five years, or since the last accreditation visit, whichever is the more recent, that is not included in your response to Question I.C.





  • In 2002, New Mexico Tech achieved NSA designation as a National Center of Academic Excellence in Information Assurance Education (CAE/IAE; now jointly administered by NSA and DHS); Computer Science is the home department of the CAE/IAE. Since the designation, several courses on information assurance (IA) topics have been developed and taught, on a regular or one-time basis (as special topics).




  • Because New Mexico Tech is a qualified CAE/IAE, an NSF SFS (Scholarship for Service) program, funded by the National Science Foundation, was established at the university. So far, this program has provided full scholarships (for up to two years) to fifteen CS majors over four years, giving them training in Information Assurance and placement in government jobs upon graduation.




  • Closely related to the CAE/IAE is the establishment of the university’s ICASA division in 2001. Several undergraduate students have been placed at ICASA as interns, and most of them stayed to become full-time employees after graduation.



  E. Program Current Status

1. List the strengths of the unit offering the computer science program.





  • Highly qualified research faculty and adjunct faculty with real-world experience;




  • Strong curriculum designed to balance theory and applications;




  • Very strong information assurance curriculum;




  • Great scholarship opportunities including NSF SFS and DoD IASP programs;




  • Opportunities for highly individualized instruction and research participation;




  • Great resources and availability of student support;




  • Close proximity to Sandia National Laboratories, which gives students access to Sandia staff and projects as interns;




  • Low cost and high-quality education, which combine to make the university a consistently ranked college best buy; and




  • Excellent track record in student placement.

2. List any weaknesses or limitations of the institution or unit offering the computer science program.




The small faculty size limits the number of research areas where leading-edge expertise can be developed as well as the variety of electives that can be offered.

The location of the university and its science/technology/engineering emphasis limit opportunities for an enriched social life.

3. List any significant plans for future development of the program.





Under consideration are:


  • Development of computer engineering courses for the enrichment of the B.S. curriculum.




  • Hiring an instructor or faculty member with primarily undergraduate teaching and advising responsibilities.




  • Promotion of student organizations and faculty-student interaction in order to enrich social life.

  II. Student Support


Intent: Students can complete the program in a reasonable amount of time. Students have ample opportunity to interact with their instructors. Students are offered timely guidance and advice about the program’s requirements and their career alternatives. Students who graduate the program meet all program requirements.


Standard II-1. Courses must be offered with sufficient frequency for students to complete the program in a timely manner.



