
Draft October 12, 2011


Office for Institutional Effectiveness
Building a culture of evidence, supporting improvement and innovation,
raising resources, and reaching for the highest

Progress on Student Learning Outcomes Assessment – 2008-2012
In March 2008, the College adopted a plan for assessing student learning outcomes in programs (http://ofie.kcc.hawaii.edu/images/stories/Proposed_SLO_Assessment_framework_030508.pdf) (Standard II.A.2.a).

In November 2010, the College adopted a plan for assessing student learning in courses (http://ofie.kcc.hawaii.edu/images/stories/FS_Course_Level_Assessment_Plan_11_8_2010_dlv_final.pdf) (Standard II.A.2.a).

Student Services is included in the program assessment plan (Standard II.B.4). The counseling staff attended a two-day Assessment Academy on outcomes-based assessment in spring 2009, and a follow-up training in spring 2010 tracked the progress made by the individual counseling clusters. The counselors adopted the term student development outcomes (SDO) to describe their learning goals for students and divided into eight teams covering all student services areas (Health, CTE, Career and Transfer Center, targeted populations, first-year experience, student engagement, the International Center, and Kahikoluamea). Each team has been working on assessment planning and data collection since spring 2009 and is in the process of using data to improve student progression through the degree pathway (Standard II.B.4).

The counselors are using practical assessments such as questionnaires, surveys, observations, and counselor notes. Groups are also involved in establishing rubrics. The college also relies heavily on the results of the nationally benchmarked CCSSE surveys to collect data on the main student services functions such as career counseling, advising, and learning support. To date, most teams have created SDOs, rubrics, and program matrices and used the data to improve strategies; two programs have not yet begun this process due to the college’s recent reorganization, which realigned the counseling units and changed staffing assignments.



In spring 2008 and in 2008-09, the Student Learning Outcomes Assessment Coordinator offered workshops on the following topics:

    • Introduction to Kapi’olani’s Program Level Assessment plan

    • How to Write Student Learning Outcomes (department specific workshops)

    • Drafting Program/Course Alignment Grids (department specific workshops)

    • What is Outcomes Based Education?

    • Introduction to Rubrics

    • Using Direct and Indirect Evidence

In 2009-2010, the Student Learning Outcomes Assessment Coordinator offered workshops on the following topics:

    • How to Write Student Learning Outcomes

    • Difference between SLOs and Competencies

    • Introduction to Kapi’olani’s Program Level Assessment plan

    • Authentic Assessment

    • Rubric Development

    • What is Outcomes Based Education?

    • Incorporating Assessment Data into Contract Renewals and Tenure/Promotion Documents

In 2010-2011, the assessment coordinator offered workshops on the following topics:

    • Difference between assessment and grading

    • Writing an assessment plan

    • Developing rubrics

    • How to analyze data

    • How to make changes based on assessment data

    • Assessment and WASC: What are the expectations?

    • Incorporating Assessment Data into Contract Renewals and Tenure/Promotion

In fall 2011 (thus far), the assessment coordinator has offered workshops on the following topics:

  • Incorporating Assessment Data into Contract Renewals and Tenure/Promotion Documents

  • Assessment and WASC: What are the expectations? (for Kahikoluamea and ENG 100 faculty)

In January 2011 and September 2011, the assessment coordinator offered course level assessment training sessions for all lead assessment faculty. The training coincided with the implementation of the course level assessment plan. The Vice Chancellor for Academic Affairs offered three workshops on assessment in spring 2011, and the assessment coordinator offered a comprehensive workshop for department chairs in summer 2011. The assessment coordinator also works with program coordinators and lead assessment faculty on assessment issues, including developing tools such as surveys, rubrics, signature assignments, and embedded essays; analyzing data and writing reports; and developing and revising student learning outcomes statements. The assessment coordinator developed a comprehensive assessment site on Laulima to assist faculty with assessment issues (https://laulima.hawaii.edu/portal/site/101fa374-d034-4676-8f07-7815cbe09f55/page/0e2ac094-39ec-4dff-a343-9a9fb92b5ad9). The site contains assessment articles, links to websites, sample assessment tools and reports, and more in-depth information on assessment topics including rubric design and analyzing data. The assessment coordinator created a program assessment manual, which is posted on the KCC Assessment Laulima site and was sent to program coordinators in spring 2010 with the assessment report template. Finally, the assessment coordinator provided a workshop on SLO development and assessment to Continuing Education Coordinators in summer 2011 (Standard II.A.2.i).
All recruitment advertisements for new faculty include statements that specify faculty roles and responsibilities in learning outcomes assessment. This language reads: “Under general supervision, design, deliver, and assess instruction in [discipline or disciplines] in terms of student-learning outcomes; develop and/or update course content and materials and teaching and assessment strategies and methods to 1) improve student attainment of learning outcomes…”
In working through the development of outcomes and assessment instruments, the faculty and the assessment coordinator align evaluation methods with outcomes and design assessment rubrics that reflect reasonable levels of attainment (Standard III.A.1.c).
In fall 2011, OFIE administered a campus-wide survey to faculty and staff. The survey had an overall response rate of 50.1 percent. The number of responses varied by question, and the percentage of “Don’t Know” responses also varied. In this survey, 312 to 314 individuals answered five questions related to the College’s mission (agreeing responses/total responses in parentheses):


  • 91.0 percent (284/312) strongly or somewhat agreed that they were committed to improving the effectiveness of their educational/professional practice to improve student learning and success.

  • 86.3 percent (271/314) strongly or somewhat agreed that the mission statement expresses the college-wide commitment to learning.

  • 57.4 percent (179/312) strongly or somewhat agreed that they used program review data or other institutional assessment data to help their department or unit identify areas for improvement.

  • 51.8 percent (162/313) strongly or somewhat agreed that they participated actively in the planning or priority-setting process in their department.

  • 47.8 percent (150/314) strongly or somewhat agreed that they have discussed the relevance of the mission statement to student learning with peers or administrators.
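The percentages above are straightforward agreement ratios: strongly or somewhat agreeing responses divided by total responses for each question. The function below is an illustrative sketch of that arithmetic only, not part of OFIE’s survey methodology:

```python
# Illustrative only: reproduces the survey agreement percentages above
# from raw counts (strongly/somewhat agreeing responses over total).

def agreement_rate(agreeing: int, total: int) -> float:
    """Percentage of respondents agreeing, rounded to one decimal place."""
    return round(100.0 * agreeing / total, 1)

print(agreement_rate(284, 312))  # 91.0 (commitment to improving practice)
print(agreement_rate(150, 314))  # 47.8 (discussed mission relevance)
```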

In this same survey, approximately three out of four faculty (N=200) reported that:



  1. they used student learning assessment results to address weak areas of student learning.

  2. student learning assessment results are a valuable guide for improving their teaching.

  3. they actively engage in student learning outcomes assessment.

  4. their course competencies are clearly aligned with program learning outcomes.

  5. they had participated in student learning outcomes assessment.

  6. they were willing to work with their colleagues on student learning outcomes assessment.

  7. they would be more willing to do student learning outcomes assessment if examples were available for them to adopt.

  8. they know where to find assistance in developing student learning outcomes assessment.

  9. they see the value in student learning outcome assessment.

As a result of developments since 2009, the college has moved to integrate degree, program, and course learning assessments into the fall 2011 Annual Review of Program Data process. Through this integration, learning assessment will be woven into the three-year comprehensive program review and the three-year tactical planning for improvement in 2009-12 and 2012-15, and will inform the next round of strategic planning and mission development in 2014-15.



Assessment of Student Learning in Programs

Career and Technical Education

The College convenes advisory councils and other groups of professionals to review campus programs and recommend changes and improvements to keep the programs relevant to the needs of the contemporary workplace. In the Career and Technical Education (CTE) programs, results on licensure exams and dialog with industry advisors ensure high-quality and timely assessment and the development of improvements in pedagogy, curriculum, and program design.

CTE program learning outcomes are also aligned with the standards of professional accrediting agencies to assure that national standards are met. The following programs have aligned their learning outcomes in this way: Respiratory Therapy Assistant, Radiologic Technology, Occupational Therapy Assistant, Medical Assistant, Physical Therapy Assistant, Nursing, other Health programs, Culinary Arts, Paralegal, and Hospitality Education.

In its June 2011 annual program report to the ACCJC/WASC, the college reported the following licensure exam pass rates for the 2009-2010 academic year:

AS Nursing (ADN): 100 percent

AS Nursing (LPN-RN): 100 percent

Practical Nursing (PN): 100 percent

Radiologic Technician: 100 percent

Respiratory Care: 100 percent

Occupational Therapy Assistant: 100 percent

Exercise and Sports Science: 100 percent

Medical Assisting: 56.3 percent


The most recent data on Nursing licensure exams, for academic year 2010-11, indicate that the KCC RN pass rate was 92.0 percent and the PN pass rate was 100 percent; the national average for the PN exam was 85.0 percent. Additionally, the nursing faculty work with the Assessment Technology Institute (ATI), which provides practice assessment testing, computerized case scenario exercises, and written resources. At the completion of major content areas, including fundamental principles and skills, medical-surgical nursing, maternity nursing care, pediatric nursing care, psychiatric and mental health nursing, pharmacology, and leadership, nursing students are required to pass computerized assessment tests. After each test, nursing faculty analyze the results and revise curriculum and instructional methods. Results, improvements, and testing issues, including establishing benchmarks, are discussed and voted on in department meetings. ATI also assists the Nursing department in aggregating data over several semesters so the nursing faculty can identify and respond to trends. The Nursing comprehensive assessment testing results have remained above the national mean for the past two years.
The Respiratory Care Program uses two different credentialing exams and employer and student surveys that are aligned with the program learning outcomes to monitor program quality. In a 2010 employer survey, 100% of respondents rated graduates above the benchmark for performance, and 100% of employers indicated graduates were satisfactory relative to professional behavior, communication skills, and multicultural knowledge. In addition to the didactic and clinical courses they provide in the program, the faculty offer exam preparation workshops to help prepare students for the credentialing exams.
The Occupational Therapy Assistant program uses the Fieldwork Performance Evaluation (FWPE) for the Occupational Therapy Assistant Student (AOTA) to measure attainment of program learning outcomes and monitor program quality. In 2010, the program faculty analyzed results; although 85-100% of students were meeting or exceeding the benchmark on the FWPE, the faculty implemented a practice exam in 294L, Professional Concepts Lab, and incorporated more NBCOT sample test questions into their exams for the didactic courses. They are monitoring the effect of these changes on student learning. The faculty are also creating curriculum around communication skills (program learning outcome #4) to address the 29% of students who scored below the benchmark on that outcome. OTA faculty are also making programmatic improvements based on their ARPD reports and other achievement outcome data, which indicated a problem with attrition.
The Physical Therapy Assistant program uses clinical internship and evaluations, course assessments, and a verbal exit survey to assess program learning outcomes and monitor program quality. Currently, PTA students are not required to pass a licensure exam to practice in Hawaii. PTA faculty require students to score a 3 or higher on all clinical evaluations. 2010 data indicated that 100% of students were meeting or exceeding the benchmark for the clinical evaluations. To strengthen the use of course assessments for program evaluation, the PTA faculty are in the process of drafting rubrics for the major course assessments that are aligned with the program learning outcomes.
The Radiologic Technology program uses a national certification exam given by the American Registry of Radiologic Technologists. The national exam assesses the knowledge and cognitive skills required of an entry-level radiographer. The major content areas of the exam include radiation protection, equipment operation and quality control, image production and evaluation, radiographic procedures, and patient care and education. The program learning outcomes are aligned with these content areas. The program’s first-time pass rate average from 2007-2011 was 100%, compared to the national average of 91.6%. The program’s average cohort score from 2007-2011 was 90.4, compared to the average national score of 84.8.
The Medical Assisting Program (MEDA) works closely with the Medical Assisting Education Review Board (MAERB), which has established thresholds for outcome assessment in medical assisting programs accredited by the Commission on Accreditation of Allied Health Education Programs (CAAHEP). These outcomes are mandated as part of the 2008 Standards and Guidelines for Accreditation of Educational Programs in Medical Assisting and are monitored annually through the MAERB Annual Report. One of these outcomes is the National Credentialing Success Rate (CMA (AAMA) or RMA (AMT)), with a threshold of greater than 70 percent, effective for 2009 graduates. If a program has 100 graduates within the five-year reporting period beginning in 2009, at least 70 of those 100 would need to become credentialed as a CMA (AAMA) or RMA (AMT).
Currently, medical assistants are not required by law to be credentialed to work in the State of Hawaii. However, after consultation with the Medical Assisting Program Advisory Committee and the recent MAERB/CAAHEP accreditation site team surveyors, both groups advised mandatory participation in the credentialing exam prior to graduation because it is tied to a required threshold for program accreditation. Medical Assisting Program faculty will implement a mandatory credentialing examination starting in spring 2012 for AS degree students and summer 2012 for CA students. These semesters will include a comprehensive subject review along with examination preparation and strategies based on individual student learning assessments. Starting in 2013, the mandatory examination will be part of the summer requirements, since all students completing the first-year curriculum (Certificate of Achievement program) are eligible to take the national examination. Results of the certification examination will be analyzed and used to ensure program quality as well as to make improvements to the MEDA program’s pedagogy, curriculum, and program design.
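The MAERB credentialing threshold described above reduces to a simple ratio check. The function below is a hypothetical sketch of that check, not an MAERB tool, and the counts are illustrative:

```python
# Hypothetical sketch of the MAERB outcome threshold described above:
# at least 70 percent of graduates in the five-year reporting period
# must earn a CMA (AAMA) or RMA (AMT) credential.

def meets_credentialing_threshold(credentialed: int, graduates: int,
                                  threshold: float = 0.70) -> bool:
    """True if the program meets or exceeds the credentialing threshold."""
    return graduates > 0 and credentialed / graduates >= threshold

print(meets_credentialing_threshold(70, 100))  # True
print(meets_credentialing_threshold(69, 100))  # False
```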

In the Culinary Arts Department, faculty are using practical exams to assess student achievement of two program learning outcomes. They also use real-world, authentic assessment in their culinary labs and compete in the American Culinary Federation’s (ACF) annual student competition, which embodies the ACF’s high standards. KCC won the national championship in 2009 and most recently won a gold medal in the regional competition. The program is aligned with the ACF’s standards. The program faculty also meet regularly with the advisory board council to ensure that the knowledge and skills taught in the curriculum are relevant to the needs of the contemporary workforce. The program most recently worked with its advisory board council and other industry professionals and consultants to develop an Advanced Professional Certificate in Culinary Management (APC). The program faculty have completed a cycle of assessment and are implementing pedagogical, curricular, or programmatic improvements (add examples here).

Hospitality Education faculty analyzed internship supervisor evaluations of student performance and a student survey that corresponded to the supervisor evaluation form. Hospitality has completed a cycle of assessment and has made curricular changes based on assessment and on alignment of the courses with the program learning outcomes. The program also meets regularly, internally and with its advisory council, to discuss the program learning outcomes and ensure the program is addressing industry standards. A significant change, made using learning outcomes data, ARPD data, industry input, and internal faculty discussions, was to merge the Hospitality and Tourism tracks into one program.

The Paralegal Program has been assessing its program learning outcomes since 2008. After faculty and staff dialog, and with input from community advisors, the program reduced the number of program learning outcomes from seven to six. Each semester, the faculty review one program SLO at a faculty meeting, discussing which courses it applies to and how it is assessed in those courses. The program coordinator then collects three samples from two or three of the courses as evidence that the SLO is being taught and assessed in the program. The program is on track to review the sixth SLO in fall 2011 and will use the results to improve pedagogy, curriculum, and/or program design.

Within each CTE program, faculty members are developing methods to assess the degree to which students are achieving program learning outcomes. In Information Technology, faculty are using rubrics that measure a) analysis and solution design; b) creation of an appropriate user interface; c) connection of front-end to back-end databases; and d) appropriate program documentation. In Paralegal Education, faculty are evaluating analytical reports and exam questions. The IT, ICS, and Accounting programs have successfully used Perkins funds to obtain personal computers and e-tablets to update pedagogy for improved student learning. Marketing faculty are currently employing marketing plans as assignments to assess students’ abilities to integrate marketing tools and techniques.

Twenty-two CTE programs track Perkins Performance Indicators, and the college was the only one in the UHCC system to exceed all six performance standards in 2009-10. On the Tech Skills Attainment standard (students with GPAs of 2.0 or higher who stopped program participation in the year reported, divided by all students who stopped program participation in the year reported), these 22 programs had an average score of 96.3 percent, with the lowest percentage in Information Technology (83.3%) and twelve programs at 100 percent.
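The Tech Skills Attainment standard in the parenthetical above is a single ratio. As a hedged illustration only (the counts below are invented, not actual program data):

```python
# Sketch of the Perkins Tech Skills Attainment indicator defined above:
# students with GPA >= 2.0 who stopped program participation in the
# reported year, divided by all students who stopped participation.

def tech_skills_attainment(gpa_2_plus: int, all_stoppers: int) -> float:
    """Indicator as a percentage, rounded to one decimal place."""
    if all_stoppers == 0:
        raise ValueError("no students stopped participation this year")
    return round(100.0 * gpa_2_plus / all_stoppers, 1)

print(tech_skills_attainment(25, 30))  # 83.3 (invented example counts)
```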



Arts and Sciences

The Arts and Sciences cluster is responsible for the general education program and five associate degree programs:


1) Liberal Arts/Associate in Arts (AA)

2) Associate in Science, Natural Science (ASNS, STEM Transfer Degree), with two concentrations

3) Associate in Science in New Media Arts, with two concentrations

4) Associate in Science, Educational Paraprofessional, with three concentrations

5) Associate in Science, Educational Interpreting

Arts and Sciences faculty used the AAC&U Essential Learning Outcomes and the ACCJC/WASC learning outcomes (Standard II.A.3.b and c) to revise General Education program learning outcomes in 2009. These outcomes also serve as Liberal Arts/AA degree and institutional outcomes.
1. Thinking/Inquiry - Make effective decisions with intellectual integrity to solve problems and/or achieve goals, utilizing the skills of critical thinking, creative thinking, information literacy, and quantitative/symbolic reasoning.
2. Communication - Ethically compose and convey creative and critical perspectives to an intended audience using visual, oral, written, social, and other forms of communication.
3. Self and Community/Diversity of Human Experience - Evaluate one's own ethics and traditions in relation to those of other peoples and embrace the diversity of human experience while actively engaging in local, regional, and global communities.

4. Aesthetic Engagement - Through various modes of inquiry, demonstrate how aesthetics engage the human experience, revealing the interconnectedness of knowledge and life.

5. Integrative Learning - Explore and synthesize knowledge, attitudes and skills from a variety of cultural and academic perspectives to enhance our local and global communities.
Arts and Sciences courses have course competencies which have been aligned with these outcomes (See Grids, url). Some Arts and Sciences faculty have also used the AAC&U VALUE rubrics to assess the levels at which course competencies and program learning outcomes are being achieved by students.
The Liberal Arts program has not yet identified student learning strengths and weaknesses, but a robust set of assessment strategies is currently being piloted. These include the identification of "cornerstone" courses. The cornerstone project is an opportunity for students to demonstrate that they have achieved the general education/liberal arts program learning outcomes; it requires the application of learning in a project which serves as an artifact for assessment. In spring 2011, faculty working on the cornerstone project assessed 30 learning artifacts using the AAC&U VALUE rubrics. The initial assessment indicated students were below the benchmark in critical thinking and meeting or exceeding the benchmark in integrative learning. Faculty are working to expand the cornerstone project, are discussing possible improvements in pedagogy, curriculum, and program design, and are working to complete a cycle of assessment by June 2012.
The College is developing an e-portfolio tool, now under contract with a software developer. It will be ready for implementation in December 2011 and will be one component of a student pathway portfolio. The other component, currently being developed by the same vendor, is ‘Imiloa, which focuses on student support networks and the resources available for students’ academic success (Standard II.A.3.b).
Service-Learning Emphasis faculty have completed one cycle of student learning outcomes assessment and are revising pedagogies and student supports to improve student learning. These faculty developed prompt questions that are aligned with the new general education learning outcomes. Service-learning students are required to address these in end-of-semester capstone essays. Faculty selected a 10 percent random sample of student essays from fall 2010 and spring 2011 and used a rubric to score the essays and identify student learning strengths and weaknesses. As a result of this assessment, 18 improvement strategies for faculty and 4 improvement strategies for students were developed. A second cycle of service-learning outcomes assessment will be completed in April and July 2012. (See Service-Learning Outcomes Report at OFIE, http://ofie.kcc.hawaii.edu/images/stories/Service-Learning_SLO_Assessment_2010-2011.pdf).
Service-Learning is a robust across-the-curriculum emphasis impacting 660 students per year. For more than a decade, service-learning has garnered real-world, authentic assessment data on student performance indicators from community-based (K-12 and non-profit) supervisors, as well as from faculty who supervise students in service-learning activities, such as Malama i na Ahupu'a and second language learning. These student performance indicators are:


    • Reliability (Worked when scheduled, punctual)

    • Sensitivity to Others (Including clients, customers, staff, and other volunteers)(Self and Community General Education learning outcome)

    • Willingness to Learn (Could appropriately receive feedback and information)(Communication General Education Outcome)

    • Communication Skills (Could communicate effectively with clients/supervisor to complete tasks)

    • Overall performance

In 2010-11, as KCC staff and faculty were assessing student learning via more focused, often student-led reflection sessions and capstone essays, the Service-Learning program added a sixth performance indicator: commitment to an organization’s or project’s mission.

In 2009 and 2010, Service-Learning students had course success and fall-to-spring persistence rates that were 18 percentage points higher than these rates for all students. The college is continuing to explore the development of a Service-Learning general education requirement, or concentration, as this pedagogy has been shown in both campus and national assessments to improve student achievement of general education learning outcomes (Standard II.A.3.c).


Faculty teaching Writing Intensive (WI) courses, which are required for the Liberal Arts degree, also used the AAC&U VALUE rubrics to assess student writing. They have submitted an assessment report and are making improvements in teaching citation conventions with assistance from library staff. Several assessment projects, including English 22, English 100, WI, and the library, have shown that students are below the benchmark in evaluating sources. English 22 has partnered with the library to develop a library research project. English 100 and WI faculty are still discussing ways to partner with the library to strengthen students’ learning in this area.
Faculty teaching in the new Associate in Science/Natural Science degree program have aligned their learning outcomes with the new general education learning outcomes. An ASNS Assessment Rubric was approved by Math Science faculty in December 2009. In May 2011, 13 Math Science faculty, including the department chair, worked with three faculty facilitators to assess 47 learning artifacts. Faculty identified strengths and weaknesses in student learning and identified 14 topics for further consideration in improving student learning and the assessment process (see FIRE-UP Institute 2011, ASNS Program Learning Outcomes Assessment: http://ofie.kcc.hawaii.edu/index.php?option=com_content&view=article&id=52&Itemid=66).
Many of the artifacts reviewed, including the STEM student research posters, have been selected in competitions for presentation at national undergraduate research conferences, including the National Conference on Undergraduate Research, NSF JAMS, Hawaii EPSCoR, and the Society for Advancement of Chicanos/Latinos and Native Americans in Science.
Faculty in the New Media Arts (NMA) program use digital capstone portfolios, demo reels, and practicum projects to assess five student learning outcomes. They have completed three cycles of learning outcomes assessment and have made pedagogical, curricular, and program design improvements as a result. The formal program SLO assessment process is one of several helpful tools for reflection and program improvement. It coincides with course-level assessment, monthly NMA program meetings, NMA curriculum sub-committees, NMA surveys and feedback collected from NMA students, and input from the NMA Advisory Board.
In 2010, NMA updated the AS degree in both animation and interface design by removing five courses, rearranging the sequence of two existing courses, and adding two new courses: ART 284 Animation Studio and ART 285 Interface Design Studio. These two new courses embrace the studio-based learning model (SBL) to address several issues that were discovered during the assessment process, notably a) the need for more depth into specific NMA-related topics, b) the need for students to become independent self-learners to succeed after graduation, c) the need for students to gain more experience going through the full creative process in order to create industry quality portfolio and demo-reel products, and d) the need to work in collaborative team environments to develop strong inter-personal communication skills needed to succeed in the industry/workplace.
The faculty and staff in the Library and Learning Resources (LLR) unit have developed student learning outcomes that align with the Association of College & Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education and the college’s general education learning outcomes. LLR faculty and staff have completed one cycle of learning assessment and, as a result, have worked to increase instructional faculty-librarian collaboration and the number of instructional sessions about access, evaluation, and citation. One example of this collaboration is between the Library and ENG 22 faculty to develop a two-day research activity that can be integrated into ENG 22 and ENG 100 classes. This activity was tested in summer 2011 and is currently being implemented in most ENG 22 and select ENG 100 face-to-face classes.

Developmental Education

Faculty and staff in Kahikoluamea, the Developmental Education unit, have been assessing student achievement and learning outcomes data and are implementing three improvement strategies as a result. They are:

1) piloting an online academic and career planning tool called MyPlan in selected English 22 courses. MyPlan is a guide that helps students 1) identify their strengths and interests, 2) set goals, 3) learn how to make an academic plan, 4) understand how courses relate to educational goals and a chosen career, 5) identify academic and student support services and resources based on their needs, and 6) engage with instructional faculty, counselors, advisors, and peer mentors to enhance their learning. The ENG 22 faculty, Pathways coordinator, and assessment coordinator meet to discuss assessment issues. Through these meetings they have revised the MyPlan SLOs, drafted rubrics, and discussed ways to make MyPlan a more effective tool, and at the end of fall 2011 they will assess student work from MyPlan;

2) redesigning two pathways of math courses: a STEM Path, from Math 82 to Math 103, which will feature self-paced learning rather than lectures, and a Quant Path, which will offer contextualized and computer-assisted learning models. The goal is to have all developmental students complete a college math course in their first year, defined as their first 20 credits. The math faculty meet weekly to discuss the redesign of the STEM Path and Quant Path. During these weekly discussions, they review the results of student assessments and modify the curricular modules and materials to ensure improved learning. The spring 2011 success rate for the initial STEM Path redesign was 64%, compared to a 45% success rate in the traditional model;

3) redesigning the writing pathway to include Accelerated Learning Program (ALP) models for ENG 22 and ENG 100. The goal is for students qualifying for ENG 22 to complete ENG 22 and ENG 100 in one semester. The writing faculty meet monthly to discuss learning outcomes data and achievement data. In an attempt to minimize high attrition rates, faculty began experimenting with two different accelerated learning models. In fall 2010, 76% of developmental students enrolled in the ALP courses (ENG 22 and ENG 100) successfully completed ENG 100, compared with a success rate of 66% for ENG 100 students overall. In spring 2011, the ENG 100 success rates were 75% for ALP students and 56% overall (Standard II.A.1.a).

Assessment of Student Learning in Courses

The ACCJC requires student learning assessment in all courses, and all courses must have completed an assessment cycle by fall 2012. The College approved a course-level assessment plan in December 2010 (http://ofie.kcc.hawaii.edu/images/stories/FS_Course_Level_Assessment_Plan_11_8_2010_dlv_final.pdf).

A recent survey of faculty (N=215) documents that they use a wide variety of assessment methods in their classes. In descending order of frequency, faculty use tests, written papers or reports, quizzes, oral presentations and interviews, demonstrations and performances, written portfolios and e-portfolios, and multimedia to assess the competency levels of their students, and they assign grades based on these assessments.

In July 2011, the assessment coordinator developed a Course Assessment Scorecard (http://ofie.kcc.hawaii.edu/images/stories/Course_Assessment_Scorecard_9-11.pdf). Program and discipline coordinators were assigned to lead faculty in their disciplines through the assessment process. Three priority levels were developed to expedite course-level learning assessment:



  • Priority 1 courses have six or more sections offered in a semester

  • Priority 2 courses have two to five sections offered in a semester

  • Priority 3 courses have only one section offered in a semester

In 2011-2012, programs and disciplines are working with the assessment coordinator to complete an assessment cycle for Priority 1 and 2 courses, and then Priority 3 courses. Some Priority 3 courses must complete learning assessments to meet program accreditation requirements.

Program and discipline coordinators and faculty can choose from the following four options to develop their assessments of student learning in courses:
Option 1


  1. Each instructor evaluates his/her students’ work using agreed upon criteria (rubric). The assessments/assignments that are being scored are aligned with the specific course competencies that are being measured that year.

  2. Instructors summarize their data and forward them to the program/discipline assessment coordinator, who aggregates the data.

  3. Instructors meet to discuss results and possible pedagogical, curricular, and programmatic revisions.

  4. Program/discipline assessment coordinator completes and submits an assessment report that includes the action(s) that will be taken to improve student learning.


Option 2

  1. Each instructor embeds a signature assignment or questions that are designed to measure specific competencies. Signature assignments or embedded questions are collected and scored by individual faculty using agreed upon criteria (rubric).

  2. Instructors forward scores to the program/discipline assessment coordinator, who aggregates the data.

  3. Instructors meet to discuss the results and possible pedagogical, curricular and programmatic revisions.

  4. Program/discipline assessment coordinator completes and submits an assessment report that includes the action(s) that will be taken to improve student learning.


Option 3

  1. Each instructor summarizes his/her students’ results on the target competency(ies) being assessed, using his or her own criteria.

  2. Instructors meet to discuss the results, identify commonalities, and determine a reliable way to compare and synthesize the information into a cohesive conclusion.

  3. Instructors discuss possible pedagogical, curricular, and programmatic revisions based on the results.

  4. Program/discipline assessment coordinator completes and submits an assessment report that includes the action(s) that will be taken to improve student learning.


Option 4 (if used, must be used in combination with Option 1, 2, or 3)

  1. Each instructor gives a student survey with agreed-upon questions (the SALG or another survey can be used). Survey results should be aggregated.

  2. Instructors meet to discuss the results.

  3. Because surveys are indirect evidence of student learning, they can be used to validate the direct evidence collected in options 1, 2, and 3.

  4. Program/discipline assessment coordinator includes information from student surveys on the assessment report.

Student learning is being assessed in the following CTE courses (number of courses in parentheses): Accounting (3), Information Technology (9), Paralegal Education (1), Marketing (1), Other Business (3), Culinary Arts and Food Service (11), Mobile Intensive Care Technician (1), Medical Assisting (1), Medical Laboratory Technician (1), Occupational Therapy Assistant (11), Physical Therapy Assistant (1), Radiology Technician (1), Respiratory Care Technician (11), Nursing (2), Exercise and Sports Science (1), Other Health (3), and Hospitality Education (6). Eight of these CTE courses are Priority 1, 29 are Priority 2, and 23 are Priority 3. Four CTE faculty are using Option 1, one is using Option 2, and four are using Option 3.


The Respiratory Care Technician program has completed an assessment for each of the courses offered in the program, and the Occupational Therapy Assistant program has completed assessments for almost all of its courses. As part of the assessment, OTA faculty aligned the course learning outcomes to the OTA ACOTE Standards. Both RESP and OTA courses are closely aligned with their program learning outcomes. Assessments in 2010-11 indicated that students in both programs were meeting or exceeding the benchmarks for the course learning outcomes. Information Technology has begun course-level assessment. ICS courses (100, 101, 111, and 211) use published tests to measure students’ attainment of the learning outcomes. Faculty are monitoring the results but have not yet made any pedagogical, curricular, or programmatic improvements. Hospitality assessed HOST 100, 101, and 170 in spring 2011. Faculty are discussing ways to improve learning for course learning outcomes where students were below the benchmark; the discussions are focused on revising course learning outcomes and creating more engaging assignments to help students improve their learning. HOST faculty are working on assessing HOST 152, 171, 290, and 293 during fall 2011.
Student learning is being assessed in the following Arts and Sciences courses (number of courses in parentheses): History (2), Religion (1), Speech (3), Theater (3), Dance (5), Music (11), Art (11), Hawaiian Studies (2), Philosophy (3), Pacific Islands Studies (1), Math (7), Biology (12), Botany (3), Chemistry (6), Microbiology (1), Physics (2), Zoology (4), Anthropology (1), Economics (3), Education (3), Family Resources (1), Psychology (3), Sociology (1), Social Science (1), Geography (4), English (10), English for Speakers of Other Languages (4), American Sign Language (3), Chinese (2), French (2), Hawaiian (4), Japanese (5), Korean (1), Spanish (3), Journalism (1), and Linguistics (1). Twenty-eight of these courses are Priority 1, 67 are Priority 2, and 27 are Priority 3. Eight Arts and Sciences faculty are using Option 1, 22 are using Option 2, and 10 are using Option 3.
ESOL and Speech faculty met regularly during the spring 2011 semester to develop rubrics based on their respective courses and learning outcomes and to discuss the results. The foreign language faculty developed a rubric in spring 2011 and implemented an assessment in August 2011. In summer 2011, the History faculty met to develop signature assignments for HIST 151 and HIST 152; the signature assignments will be used to assess course learning outcomes in 2011-12. Some Sociology faculty developed rubrics and embedded questions to assess SOC 100 in 2011-12. ENG 100 faculty are working in small writing groups to create a rubric for pre-assigned course learning outcomes; they will assess in 2011-12, adding to the assessment work they started in spring 2011. Hawaiian Studies 107 faculty met in August 2011 to create shared exam questions for assessment in 2011-12, and based on assessment work in spring 2011, HWST 100 was redesigned in summer 2011. Math and Science faculty are developing shared questions to use for assessment purposes in 2011-12. Although these disciplines are making progress, they have yet to complete an assessment of student learning that would identify strengths and weaknesses to be addressed through pedagogical, curricular, or programmatic improvements.

Philosophy 110 faculty found that students were meeting or exceeding the benchmark for three of the five learning outcomes. They are continuing to discuss ways to strengthen learning for the other two learning outcomes, focusing on ways to close the gap between understanding concepts and applying techniques.


Student learning is being assessed in the following Developmental courses (number of courses in parentheses): English (2), Math (3). Four of these courses are Priority 1, and one is Priority 2. Two faculty are using Option 1 and four are using Option 2.
ENG 22 faculty have been working on assessment for several years. In fall 2010, using the end-of-semester portfolio reading and a corresponding assessment rubric based on the course SLOs, the English 22 faculty identified specific areas in which students were not achieving scores of “competent” or “strong.” In particular, English 22 Portfolio Assessment Data compiled from all fall 2010 English 22 courses indicated that students scored lower on the Research outcome than on the other identified outcomes: 17% of students were rated “Not at College Level” in activities associated with research skills, such as annotating, summarizing, and synthesizing information from research sources, compared to 8% in Writing Process, 7% in Essays, 6% in Editing, and 7% in Organization. These results were further validated by the English 22 Portfolio Assessment Data compiled from all spring 2011 English 22 courses, in which 20.5% of students were rated “Not at College Level” in Research, compared to 9.1% in Writing Process, 10.2% in Essays, 12.5% in Editing, and 12.5% in Organization. Faculty determined that the lower success rates in these areas may be preventing students from successfully completing the developmental writing sequence. As a result, writing and library faculty co-designed the Library Research Activity (prototyped during summer 2011 in selected English 100 and English 22 courses) to promote successful attainment and application of research skills across the writing pipeline.
Based on poor course learning outcomes data and achievement data, the Math 24 and Math 25 faculty began to rethink their math curriculum. In fall 2010, the Math 24 faculty implemented a course redesign for Elementary Algebra I. The goals of the redesign were 1) to encourage students to take an active role in their own learning, build on timely assessment, and utilize faculty guidance, and 2) to move from a seat-time model to one based on subject mastery. In spring 2011, the success rate for the redesigned course was 64%, compared to 45% in traditionally taught courses. Additionally, students’ average raw score on the final exam increased from 17.75 to 21.56, and based on survey data, students spent more time working on math problems (from 32-48 hours per week to 35-152 hours per week).
The College is continuing to offer the redesigned Math 24 in fall 2011, but recognizes that one of the goals of the emporium model (on which the redesign is based) is to create mobility within the developmental math sequence. To this end, the College is developing Math 82, which combines curriculum and content from Math 24 and Math 25 and leads directly into Math 103, College Algebra. The College will pilot Math 82 in spring 2012, with full implementation in fall 2012.


1 Success is defined as a grade of C or higher.


